results-sd-v1-5-sd-v2-1-if-v1-0-karlo/b602fbf2
--- dataset_info: features: - name: result dtype: string - name: id dtype: int64 splits: - name: train num_bytes: 184 num_examples: 10 download_size: 1335 dataset_size: 184 --- # Dataset Card for "b602fbf2" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
BrunoGR/Emo_support_11kBalanced
--- configs: - config_name: default data_files: - split: test path: data/test-* - split: train path: data/train-* - split: validation path: data/validation-* dataset_info: features: - name: texto dtype: string - name: etiqueta dtype: string splits: - name: test num_bytes: 152156 num_examples: 1309 - name: train num_bytes: 12765622 num_examples: 121708 - name: validation num_bytes: 253200 num_examples: 2200 download_size: 8063810 dataset_size: 13170978 --- # Dataset Card for "Emo_support_11kBalanced" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
sanagnos/processed_gpt_dataset_small
--- dataset_info: features: - name: input_ids sequence: int32 - name: attention_mask sequence: int8 - name: special_tokens_mask sequence: int8 splits: - name: train num_bytes: 11145751500.0 num_examples: 14289425 download_size: 3620004230 dataset_size: 11145751500.0 --- # Dataset Card for "processed_gpt_dataset_small" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
aswin1906/github-advisory-2019.csv
--- license: apache-2.0 ---
Chakshu/conversation_ender
--- license: mit task_categories: - text-classification language: - en tags: - Conversation size_categories: - n<1K pretty_name: Conversation Enders --- Conversation Ending Check
EleutherAI/quirky_subtraction_increment0_bob_hard
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* dataset_info: features: - name: alice_label dtype: bool - name: bob_label dtype: bool - name: difficulty dtype: int64 - name: statement dtype: string - name: choices sequence: string - name: character dtype: string - name: label dtype: bool splits: - name: train num_bytes: 3169094.7865260416 num_examples: 48047 - name: validation num_bytes: 66834.1945 num_examples: 1013 - name: test num_bytes: 66270.2025 num_examples: 1005 download_size: 1186212 dataset_size: 3302199.183526042 --- # Dataset Card for "quirky_subtraction_increment0_bob_hard" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ruliad/AutoMathText
--- license: cc-by-sa-4.0 task_categories: - text-generation - question-answering language: - en pretty_name: AutoMathText size_categories: - 10B<n<100B configs: - config_name: web-0.50-to-1.00 data_files: - split: train path: - data/web/0.95-1.00.jsonl - data/web/0.90-0.95.jsonl - data/web/0.85-0.90.jsonl - data/web/0.80-0.85.jsonl - data/web/0.75-0.80.jsonl - data/web/0.70-0.75.jsonl - data/web/0.65-0.70.jsonl - data/web/0.60-0.65.jsonl - data/web/0.55-0.60.jsonl - data/web/0.50-0.55.jsonl default: true - config_name: web-0.60-to-1.00 data_files: - split: train path: - data/web/0.95-1.00.jsonl - data/web/0.90-0.95.jsonl - data/web/0.85-0.90.jsonl - data/web/0.80-0.85.jsonl - data/web/0.75-0.80.jsonl - data/web/0.70-0.75.jsonl - data/web/0.65-0.70.jsonl - data/web/0.60-0.65.jsonl - config_name: web-0.70-to-1.00 data_files: - split: train path: - data/web/0.95-1.00.jsonl - data/web/0.90-0.95.jsonl - data/web/0.85-0.90.jsonl - data/web/0.80-0.85.jsonl - data/web/0.75-0.80.jsonl - data/web/0.70-0.75.jsonl - config_name: web-0.80-to-1.00 data_files: - split: train path: - data/web/0.95-1.00.jsonl - data/web/0.90-0.95.jsonl - data/web/0.85-0.90.jsonl - data/web/0.80-0.85.jsonl - config_name: web-full data_files: data/web/*.jsonl - config_name: arxiv-0.50-to-1.00 data_files: - split: train path: - data/arxiv/0.90-1.00/*.jsonl - data/arxiv/0.80-0.90/*.jsonl - data/arxiv/0.70-0.80/*.jsonl - data/arxiv/0.60-0.70/*.jsonl - data/arxiv/0.50-0.60/*.jsonl - config_name: arxiv-0.60-to-1.00 data_files: - split: train path: - data/arxiv/0.90-1.00/*.jsonl - data/arxiv/0.80-0.90/*.jsonl - data/arxiv/0.70-0.80/*.jsonl - data/arxiv/0.60-0.70/*.jsonl - config_name: arxiv-0.70-to-1.00 data_files: - split: train path: - data/arxiv/0.90-1.00/*.jsonl - data/arxiv/0.80-0.90/*.jsonl - data/arxiv/0.70-0.80/*.jsonl - config_name: arxiv-0.80-to-1.00 data_files: - split: train path: - data/arxiv/0.90-1.00/*.jsonl - data/arxiv/0.80-0.90/*.jsonl - config_name: arxiv-full data_files: - split: train 
path: - data/arxiv/0.90-1.00/*.jsonl - data/arxiv/0.80-0.90/*.jsonl - data/arxiv/0.70-0.80/*.jsonl - data/arxiv/0.60-0.70/*.jsonl - data/arxiv/0.50-0.60/*.jsonl - data/arxiv/0.00-0.50/*.jsonl - config_name: code-0.50-to-1.00 data_files: - split: train path: - data/code/agda/0.95-1.00.jsonl - data/code/agda/0.90-0.95.jsonl - data/code/agda/0.85-0.90.jsonl - data/code/agda/0.80-0.85.jsonl - data/code/agda/0.75-0.80.jsonl - data/code/agda/0.70-0.75.jsonl - data/code/agda/0.65-0.70.jsonl - data/code/agda/0.60-0.65.jsonl - data/code/agda/0.55-0.60.jsonl - data/code/agda/0.50-0.55.jsonl - data/code/c/0.95-1.00.jsonl - data/code/c/0.90-0.95.jsonl - data/code/c/0.85-0.90.jsonl - data/code/c/0.80-0.85.jsonl - data/code/c/0.75-0.80.jsonl - data/code/c/0.70-0.75.jsonl - data/code/c/0.65-0.70.jsonl - data/code/c/0.60-0.65.jsonl - data/code/c/0.55-0.60.jsonl - data/code/c/0.50-0.55.jsonl - data/code/cpp/0.95-1.00.jsonl - data/code/cpp/0.90-0.95.jsonl - data/code/cpp/0.85-0.90.jsonl - data/code/cpp/0.80-0.85.jsonl - data/code/cpp/0.75-0.80.jsonl - data/code/cpp/0.70-0.75.jsonl - data/code/cpp/0.65-0.70.jsonl - data/code/cpp/0.60-0.65.jsonl - data/code/cpp/0.55-0.60.jsonl - data/code/cpp/0.50-0.55.jsonl - data/code/fortran/0.95-1.00.jsonl - data/code/fortran/0.90-0.95.jsonl - data/code/fortran/0.85-0.90.jsonl - data/code/fortran/0.80-0.85.jsonl - data/code/fortran/0.75-0.80.jsonl - data/code/fortran/0.70-0.75.jsonl - data/code/fortran/0.65-0.70.jsonl - data/code/fortran/0.60-0.65.jsonl - data/code/fortran/0.55-0.60.jsonl - data/code/fortran/0.50-0.55.jsonl - data/code/gap/0.95-1.00.jsonl - data/code/gap/0.90-0.95.jsonl - data/code/gap/0.85-0.90.jsonl - data/code/gap/0.80-0.85.jsonl - data/code/gap/0.75-0.80.jsonl - data/code/gap/0.70-0.75.jsonl - data/code/gap/0.65-0.70.jsonl - data/code/gap/0.60-0.65.jsonl - data/code/gap/0.55-0.60.jsonl - data/code/gap/0.50-0.55.jsonl - data/code/github-coq-train/0.95-1.00.jsonl - data/code/github-coq-train/0.90-0.95.jsonl - 
data/code/github-coq-train/0.85-0.90.jsonl - data/code/github-coq-train/0.80-0.85.jsonl - data/code/github-coq-train/0.75-0.80.jsonl - data/code/github-coq-train/0.70-0.75.jsonl - data/code/github-coq-train/0.65-0.70.jsonl - data/code/github-coq-train/0.60-0.65.jsonl - data/code/github-coq-train/0.55-0.60.jsonl - data/code/github-coq-train/0.50-0.55.jsonl - data/code/github-isabelle-train/0.95-1.00.jsonl - data/code/github-isabelle-train/0.90-0.95.jsonl - data/code/github-isabelle-train/0.85-0.90.jsonl - data/code/github-isabelle-train/0.80-0.85.jsonl - data/code/github-isabelle-train/0.75-0.80.jsonl - data/code/github-isabelle-train/0.70-0.75.jsonl - data/code/github-isabelle-train/0.65-0.70.jsonl - data/code/github-isabelle-train/0.60-0.65.jsonl - data/code/github-isabelle-train/0.55-0.60.jsonl - data/code/github-isabelle-train/0.50-0.55.jsonl - data/code/github-lean-train/0.95-1.00.jsonl - data/code/github-lean-train/0.90-0.95.jsonl - data/code/github-lean-train/0.85-0.90.jsonl - data/code/github-lean-train/0.80-0.85.jsonl - data/code/github-lean-train/0.75-0.80.jsonl - data/code/github-lean-train/0.70-0.75.jsonl - data/code/github-lean-train/0.65-0.70.jsonl - data/code/github-lean-train/0.60-0.65.jsonl - data/code/github-lean-train/0.55-0.60.jsonl - data/code/github-lean-train/0.50-0.55.jsonl - data/code/github-MATLAB-train/0.95-1.00.jsonl - data/code/github-MATLAB-train/0.90-0.95.jsonl - data/code/github-MATLAB-train/0.85-0.90.jsonl - data/code/github-MATLAB-train/0.80-0.85.jsonl - data/code/github-MATLAB-train/0.75-0.80.jsonl - data/code/github-MATLAB-train/0.70-0.75.jsonl - data/code/github-MATLAB-train/0.65-0.70.jsonl - data/code/github-MATLAB-train/0.60-0.65.jsonl - data/code/github-MATLAB-train/0.55-0.60.jsonl - data/code/github-MATLAB-train/0.50-0.55.jsonl - data/code/haskell/0.95-1.00.jsonl - data/code/haskell/0.90-0.95.jsonl - data/code/haskell/0.85-0.90.jsonl - data/code/haskell/0.80-0.85.jsonl - data/code/haskell/0.75-0.80.jsonl - 
data/code/haskell/0.70-0.75.jsonl - data/code/haskell/0.65-0.70.jsonl - data/code/haskell/0.60-0.65.jsonl - data/code/haskell/0.55-0.60.jsonl - data/code/haskell/0.50-0.55.jsonl - data/code/idris/0.95-1.00.jsonl - data/code/idris/0.90-0.95.jsonl - data/code/idris/0.85-0.90.jsonl - data/code/idris/0.80-0.85.jsonl - data/code/idris/0.75-0.80.jsonl - data/code/idris/0.70-0.75.jsonl - data/code/idris/0.65-0.70.jsonl - data/code/idris/0.60-0.65.jsonl - data/code/idris/0.55-0.60.jsonl - data/code/idris/0.50-0.55.jsonl - data/code/isa_proofsteps/0.95-1.00.jsonl - data/code/isa_proofsteps/0.90-0.95.jsonl - data/code/isa_proofsteps/0.85-0.90.jsonl - data/code/isa_proofsteps/0.80-0.85.jsonl - data/code/isa_proofsteps/0.75-0.80.jsonl - data/code/isa_proofsteps/0.70-0.75.jsonl - data/code/isa_proofsteps/0.65-0.70.jsonl - data/code/isa_proofsteps/0.60-0.65.jsonl - data/code/isa_proofsteps/0.55-0.60.jsonl - data/code/isa_proofsteps/0.50-0.55.jsonl - data/code/julia/0.95-1.00.jsonl - data/code/julia/0.90-0.95.jsonl - data/code/julia/0.85-0.90.jsonl - data/code/julia/0.80-0.85.jsonl - data/code/julia/0.75-0.80.jsonl - data/code/julia/0.70-0.75.jsonl - data/code/julia/0.65-0.70.jsonl - data/code/julia/0.60-0.65.jsonl - data/code/julia/0.55-0.60.jsonl - data/code/julia/0.50-0.55.jsonl - data/code/jupyter-notebook/0.95-1.00.jsonl - data/code/jupyter-notebook/0.90-0.95.jsonl - data/code/jupyter-notebook/0.85-0.90.jsonl - data/code/jupyter-notebook/0.80-0.85.jsonl - data/code/jupyter-notebook/0.75-0.80.jsonl - data/code/jupyter-notebook/0.70-0.75.jsonl - data/code/jupyter-notebook/0.65-0.70.jsonl - data/code/jupyter-notebook/0.60-0.65.jsonl - data/code/jupyter-notebook/0.55-0.60.jsonl - data/code/jupyter-notebook/0.50-0.55.jsonl - data/code/lean_proofsteps/0.95-1.00.jsonl - data/code/lean_proofsteps/0.90-0.95.jsonl - data/code/lean_proofsteps/0.85-0.90.jsonl - data/code/lean_proofsteps/0.80-0.85.jsonl - data/code/lean_proofsteps/0.75-0.80.jsonl - 
data/code/lean_proofsteps/0.70-0.75.jsonl - data/code/lean_proofsteps/0.65-0.70.jsonl - data/code/lean_proofsteps/0.60-0.65.jsonl - data/code/lean_proofsteps/0.55-0.60.jsonl - data/code/lean_proofsteps/0.50-0.55.jsonl - data/code/maple/0.95-1.00.jsonl - data/code/maple/0.90-0.95.jsonl - data/code/maple/0.85-0.90.jsonl - data/code/maple/0.80-0.85.jsonl - data/code/maple/0.75-0.80.jsonl - data/code/maple/0.70-0.75.jsonl - data/code/maple/0.65-0.70.jsonl - data/code/maple/0.60-0.65.jsonl - data/code/maple/0.55-0.60.jsonl - data/code/maple/0.50-0.55.jsonl - data/code/python/0.95-1.00.jsonl - data/code/python/0.90-0.95.jsonl - data/code/python/0.85-0.90.jsonl - data/code/python/0.80-0.85.jsonl - data/code/python/0.75-0.80.jsonl - data/code/python/0.70-0.75.jsonl - data/code/python/0.65-0.70.jsonl - data/code/python/0.60-0.65.jsonl - data/code/python/0.55-0.60.jsonl - data/code/python/0.50-0.55.jsonl - data/code/r/0.95-1.00.jsonl - data/code/r/0.90-0.95.jsonl - data/code/r/0.85-0.90.jsonl - data/code/r/0.80-0.85.jsonl - data/code/r/0.75-0.80.jsonl - data/code/r/0.70-0.75.jsonl - data/code/r/0.65-0.70.jsonl - data/code/r/0.60-0.65.jsonl - data/code/r/0.55-0.60.jsonl - data/code/r/0.50-0.55.jsonl - data/code/tex/0.95-1.00.jsonl - data/code/tex/0.90-0.95.jsonl - data/code/tex/0.85-0.90.jsonl - data/code/tex/0.80-0.85.jsonl - data/code/tex/0.75-0.80.jsonl - data/code/tex/0.70-0.75.jsonl - data/code/tex/0.65-0.70.jsonl - data/code/tex/0.60-0.65.jsonl - data/code/tex/0.55-0.60.jsonl - data/code/tex/0.50-0.55.jsonl - config_name: code-python-0.50-to-1.00 data_files: - split: train path: - data/code/python/0.95-1.00.jsonl - data/code/python/0.90-0.95.jsonl - data/code/python/0.85-0.90.jsonl - data/code/python/0.80-0.85.jsonl - data/code/python/0.75-0.80.jsonl - data/code/python/0.70-0.75.jsonl - data/code/python/0.65-0.70.jsonl - data/code/python/0.60-0.65.jsonl - data/code/python/0.55-0.60.jsonl - data/code/python/0.50-0.55.jsonl - config_name: code-python-0.60-to-1.00 
data_files: - split: train path: - data/code/python/0.95-1.00.jsonl - data/code/python/0.90-0.95.jsonl - data/code/python/0.85-0.90.jsonl - data/code/python/0.80-0.85.jsonl - data/code/python/0.75-0.80.jsonl - data/code/python/0.70-0.75.jsonl - data/code/python/0.65-0.70.jsonl - data/code/python/0.60-0.65.jsonl - config_name: code-python-0.70-to-1.00 data_files: - split: train path: - data/code/python/0.95-1.00.jsonl - data/code/python/0.90-0.95.jsonl - data/code/python/0.85-0.90.jsonl - data/code/python/0.80-0.85.jsonl - data/code/python/0.75-0.80.jsonl - data/code/python/0.70-0.75.jsonl - config_name: code-python-0.80-to-1.00 data_files: - split: train path: - data/code/python/0.95-1.00.jsonl - data/code/python/0.90-0.95.jsonl - data/code/python/0.85-0.90.jsonl - data/code/python/0.80-0.85.jsonl - config_name: code-jupyter-notebook-0.50-to-1.00 data_files: - split: train path: - data/code/jupyter-notebook/0.95-1.00.jsonl - data/code/jupyter-notebook/0.90-0.95.jsonl - data/code/jupyter-notebook/0.85-0.90.jsonl - data/code/jupyter-notebook/0.80-0.85.jsonl - data/code/jupyter-notebook/0.75-0.80.jsonl - data/code/jupyter-notebook/0.70-0.75.jsonl - data/code/jupyter-notebook/0.65-0.70.jsonl - data/code/jupyter-notebook/0.60-0.65.jsonl - data/code/jupyter-notebook/0.55-0.60.jsonl - data/code/jupyter-notebook/0.50-0.55.jsonl - config_name: code-jupyter-notebook-0.60-to-1.00 data_files: - split: train path: - data/code/jupyter-notebook/0.95-1.00.jsonl - data/code/jupyter-notebook/0.90-0.95.jsonl - data/code/jupyter-notebook/0.85-0.90.jsonl - data/code/jupyter-notebook/0.80-0.85.jsonl - data/code/jupyter-notebook/0.75-0.80.jsonl - data/code/jupyter-notebook/0.70-0.75.jsonl - data/code/jupyter-notebook/0.65-0.70.jsonl - data/code/jupyter-notebook/0.60-0.65.jsonl - config_name: code-jupyter-notebook-0.70-to-1.00 data_files: - split: train path: - data/code/jupyter-notebook/0.95-1.00.jsonl - data/code/jupyter-notebook/0.90-0.95.jsonl - 
data/code/jupyter-notebook/0.85-0.90.jsonl - data/code/jupyter-notebook/0.80-0.85.jsonl - data/code/jupyter-notebook/0.75-0.80.jsonl - data/code/jupyter-notebook/0.70-0.75.jsonl - config_name: code-jupyter-notebook-0.80-to-1.00 data_files: - split: train path: - data/code/jupyter-notebook/0.95-1.00.jsonl - data/code/jupyter-notebook/0.90-0.95.jsonl - data/code/jupyter-notebook/0.85-0.90.jsonl - data/code/jupyter-notebook/0.80-0.85.jsonl - config_name: code-full data_files: - split: train path: - data/code/*/*.jsonl tags: - mathematical-reasoning - reasoning - finetuning - pretraining - llm --- # AutoMathText **AutoMathText** is an extensive and carefully curated dataset encompassing around **200 GB** of mathematical texts. It's a compilation sourced from a diverse range of platforms including various websites, arXiv, and GitHub (OpenWebMath, RedPajama, Algebraic Stack). This rich repository has been **autonomously selected (labeled) by the state-of-the-art open-source language model**, Qwen-72B. Each piece of content in the dataset is assigned **a score `lm_q1q2_score` within the range of [0, 1]**, reflecting its relevance, quality and educational value in the context of mathematical intelligence. GitHub homepage: https://github.com/yifanzhang-pro/AutoMathText ArXiv paper: https://arxiv.org/abs/2402.07625 ## Objective The primary aim of the **AutoMathText** dataset is to provide a comprehensive and reliable resource for a wide array of users - from academic researchers and educators to AI practitioners and mathematics enthusiasts. This dataset is particularly geared towards: - Facilitating advanced research in **the intersection of mathematics and artificial intelligence**. - Serving as an educational tool for **learning and teaching complex mathematical concepts**. - Providing **a foundation for developing and training AI models** specialized in processing and understanding **mathematical content**. 
## Configs

```YAML
configs:
  - config_name: web-0.50-to-1.00
    data_files:
      - split: train
        path:
          - data/web/0.95-1.00.jsonl
          - data/web/0.90-0.95.jsonl
          - ...
          - data/web/0.50-0.55.jsonl
    default: true
  - config_name: web-0.60-to-1.00
  - config_name: web-0.70-to-1.00
  - config_name: web-0.80-to-1.00
  - config_name: web-full
    data_files: data/web/*.jsonl
  - config_name: arxiv-0.50-to-1.00
    data_files:
      - split: train
        path:
          - data/arxiv/0.90-1.00/*.jsonl
          - ...
          - data/arxiv/0.50-0.60/*.jsonl
  - config_name: arxiv-0.60-to-1.00
  - config_name: arxiv-0.70-to-1.00
  - config_name: arxiv-0.80-to-1.00
  - config_name: arxiv-full
    data_files: data/arxiv/*/*.jsonl
  - config_name: code-0.50-to-1.00
    data_files:
      - split: train
        path:
          - data/code/*/0.95-1.00.jsonl
          - ...
          - data/code/*/0.50-0.55.jsonl
  - config_name: code-python-0.50-to-1.00
    data_files:
      - split: train
        path:
          - data/code/python/0.95-1.00.jsonl
          - ...
          - data/code/python/0.50-0.55.jsonl
  - config_name: code-python-0.60-to-1.00
  - config_name: code-python-0.70-to-1.00
  - config_name: code-python-0.80-to-1.00
  - config_name: code-jupyter-notebook-0.50-to-1.00
    data_files:
      - split: train
        path:
          - data/code/jupyter-notebook/0.95-1.00.jsonl
          - ...
          - data/code/jupyter-notebook/0.50-0.55.jsonl
  - config_name: code-jupyter-notebook-0.60-to-1.00
  - config_name: code-jupyter-notebook-0.70-to-1.00
  - config_name: code-jupyter-notebook-0.80-to-1.00
  - config_name: code-full
    data_files: data/code/*/*.jsonl
```

How to load data:

```python
from datasets import load_dataset

ds = load_dataset("math-ai/AutoMathText", "web-0.50-to-1.00")  # or any valid config_name
```

## Features

- **Volume**: Approximately 200 GB of text data (in natural language and programming languages).
- **Content**: A diverse collection of mathematical texts, including but not limited to research papers, educational articles, and code documentation.
- **Labeling**: Every text is **scored** by Qwen-72B, a sophisticated language model, ensuring a high standard of relevance and accuracy.
- **Scope**: Covers a wide spectrum of mathematical topics, making it suitable for various applications in advanced research and education.

## References

- OpenWebMath [[link]](https://huggingface.co/datasets/open-web-math/open-web-math)
- RedPajama [[link]](https://huggingface.co/datasets/togethercomputer/RedPajama-Data-1T)
- Algebraic Stack [[link]](https://huggingface.co/datasets/EleutherAI/proof-pile-2) (a subset of Proof-Pile-2)

## Citation

We appreciate your use of **AutoMathText** in your work. If you find this repository helpful, please consider citing it and starring the repo. Feel free to contact zhangyif21@tsinghua.edu.cn or open an issue if you have any questions (GitHub homepage: https://github.com/yifanzhang-pro/AutoMathText).

```bibtex
@article{zhang2024automathtext,
  title={AutoMathText: Autonomous Data Selection with Language Models for Mathematical Texts},
  author={Zhang, Yifan and Luo, Yifan and Yuan, Yang and Yao, Andrew Chi-Chih},
  journal={arXiv preprint arXiv:2402.07625},
  year={2024},
}
```
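The tiered configs above are unions of 0.05-wide `lm_q1q2_score` buckets at or above a threshold. As a small illustration (a hypothetical helper, not part of the dataset's tooling), the web-subset file list for any threshold can be enumerated like this:

```python
def web_bucket_files(threshold):
    """List the data/web/*.jsonl score buckets at or above `threshold`.

    Buckets are 0.05 wide and listed high-to-low, matching the config
    layout above (e.g. web-0.80-to-1.00 -> 0.95-1.00 ... 0.80-0.85).
    """
    files = []
    lo = 95  # bucket lower bounds in hundredths: 95, 90, ..., down to the threshold
    while lo >= round(threshold * 100):
        hi = min(lo + 5, 100)
        files.append(f"data/web/{lo / 100:.2f}-{hi / 100:.2f}.jsonl")
        lo -= 5
    return files
```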
maconphillips/Warren-VT-Info
--- license: mit ---
liuyanchen1015/VALUE_rte_negative_inversion
--- dataset_info: features: - name: sentence1 dtype: string - name: sentence2 dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: test num_bytes: 286 num_examples: 1 - name: train num_bytes: 794 num_examples: 1 download_size: 10857 dataset_size: 1080 --- # Dataset Card for "VALUE_rte_negative_inversion" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
BangumiBase/tenseishitarakendeshita
---
license: mit
tags:
- art
size_categories:
- n<1K
---

# Bangumi Image Base of Tensei Shitara Ken Deshita

This is the image base of the bangumi Tensei Shitara Ken Deshita, in which we detected 20 characters and 895 images in total. The full dataset is [here](all.zip).

**Please note that these image bases are not guaranteed to be 100% clean; they may contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).

Here is the characters' preview:

| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 277 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) |
| 1 | 110 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) |
| 2 | 22 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) |
| 3 | 49 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 2](3/preview_2.png) | ![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) |
| 4 | 32 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) |
| 5 | 19 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) |
| 6 | 25 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) |
| 7 | 7 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | N/A |
| 8 | 10 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) |
| 9 | 9 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) |
| 10 | 14 | [Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) |
| 11 | 99 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) |
| 12 | 9 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) |
| 13 | 24 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) |
| 14 | 34 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) |
| 15 | 14 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | ![preview 7](15/preview_7.png) | ![preview 8](15/preview_8.png) |
| 16 | 13 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | ![preview 5](16/preview_5.png) | ![preview 6](16/preview_6.png) | ![preview 7](16/preview_7.png) | ![preview 8](16/preview_8.png) |
| 17 | 9 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | ![preview 7](17/preview_7.png) | ![preview 8](17/preview_8.png) |
| 18 | 5 | [Download](18/dataset.zip) | ![preview 1](18/preview_1.png) | ![preview 2](18/preview_2.png) | ![preview 3](18/preview_3.png) | ![preview 4](18/preview_4.png) | ![preview 5](18/preview_5.png) | N/A | N/A | N/A |
| noise | 114 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
hippocrates/clinical_train
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: valid path: data/valid-* dataset_info: features: - name: id dtype: string - name: conversations list: - name: from dtype: string - name: value dtype: string - name: text dtype: string splits: - name: train num_bytes: 52143984 num_examples: 10000 - name: valid num_bytes: 46961044 num_examples: 8164 download_size: 42209790 dataset_size: 99105028 --- # Dataset Card for "clinical_train" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Deojoandco/capstone_fromgpt_without_gold_v5
--- dataset_info: features: - name: dialog_id dtype: int64 - name: dialogue dtype: string - name: summary dtype: string - name: gold_tags dtype: string - name: gpt_success dtype: bool - name: gpt_response dtype: string - name: gold_tags_tokens_count dtype: int64 - name: GPT_TAGS_FOUND dtype: bool - name: gpt_output_tags dtype: string - name: gpt_output_tag_tokens_count dtype: int64 - name: GPT_MI_FOUND dtype: bool - name: gpt_tags_token_count dtype: int64 - name: gpt_tags dtype: string - name: tag_token_count_match dtype: bool splits: - name: test num_bytes: 19859 num_examples: 12 download_size: 21426 dataset_size: 19859 configs: - config_name: default data_files: - split: test path: data/test-* --- # Dataset Card for "capstone_fromgpt_without_gold_v5" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CyberHarem/a_545_girlsfrontline
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of a_545/A-545/A-545 (Girls' Frontline) This is the dataset of a_545/A-545/A-545 (Girls' Frontline), containing 20 images and their tags. The core tags of this character are `long_hair, bangs, breasts, blonde_hair, braid, twintails, medium_breasts, hat, blue_eyes, aqua_eyes, beret, black_headwear, very_long_hair, braided_bangs, hair_ornament, hairclip`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 20 | 36.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_545_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 20 | 17.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_545_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 48 | 34.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_545_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 20 | 29.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_545_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 48 | 54.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_545_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/a_545_girlsfrontline', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, simple_background, white_background, assault_rifle, black_footwear, bodysuit, black_gloves, closed_mouth, smile, black_thighhighs, dress, alcohol, holding_bottle, full_body, high_heel_boots, holding_gun, sitting, thigh_boots | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | simple_background | white_background | assault_rifle | black_footwear | bodysuit | black_gloves | closed_mouth | smile | black_thighhighs | dress | alcohol | holding_bottle | full_body | high_heel_boots | holding_gun | sitting | thigh_boots | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------------------|:-------------------|:----------------|:-----------------|:-----------|:---------------|:---------------|:--------|:-------------------|:--------|:----------|:-----------------|:------------|:------------------|:--------------|:----------|:--------------| | 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | 
![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
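As a toy illustration of how the clustered tags above might be used, here is a hedged sketch of tag-based filtering. The plain dicts stand in for the `item.meta` mapping that `waifuc` items expose in the loading snippet above; the `filter_by_tags` helper and the sample records are our own, not part of the dataset:

```python
def filter_by_tags(items, required):
    """Keep items whose tag set contains every required tag."""
    required = set(required)
    return [item for item in items if required.issubset(item['meta']['tags'])]

# Hypothetical records mirroring the item.meta layout shown above.
items = [
    {'meta': {'filename': 'a.png', 'tags': {'1girl', 'solo', 'holding_gun'}}},
    {'meta': {'filename': 'b.png', 'tags': {'1girl', 'dress', 'smile'}}},
]

selected = filter_by_tags(items, ['1girl', 'holding_gun'])
print([item['meta']['filename'] for item in selected])  # → ['a.png']
```

With the real dataset, the same membership test can be applied to `item.meta['tags']` while iterating over the `LocalSource`.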
nlpso/m0_qualitative_analysis_ocr_ptrn_cmbert_io
---
language:
- fr
multilinguality:
- monolingual
task_categories:
- token-classification
---

# m0_qualitative_analysis_ocr_ptrn_cmbert_io

## Introduction

This dataset was used to perform **qualitative analysis** of [HueyNemud/das22-10-camembert_pretrained](https://huggingface.co/HueyNemud/das22-10-camembert_pretrained) on the **flat NER task** using the flat NER approach [M0]. It contains entries from 19th-century Paris trade directories.

## Dataset parameters

* Approach : M0
* Dataset type : noisy (Pero OCR)
* Tokenizer : [HueyNemud/das22-10-camembert_pretrained](https://huggingface.co/HueyNemud/das22-10-camembert_pretrained)
* Tagging format : IO
* Counts :
  * Train : 6084
  * Dev : 676
  * Test : 1685
* Associated fine-tuned model : [nlpso/m0_flat_ner_ocr_ptrn_cmbert_io](https://huggingface.co/nlpso/m0_flat_ner_ocr_ptrn_cmbert_io)

## Entity types

Abbreviation|Description
-|-
O |Outside of a named entity
PER |Person or company name
ACT |Person or company professional activity
TITRE |Distinction
LOC |Street name
CARDINAL |Street number
FT |Geographical feature

## How to use this dataset

```python
from datasets import load_dataset

train_dev_test = load_dataset("nlpso/m0_qualitative_analysis_ocr_ptrn_cmbert_io")
```
TaMduluza/Fire_detection_v1
--- license: eupl-1.1 ---
openbmb/UltraInteract_pair
--- language: - en license: mit pretty_name: UltraInteract_pair configs: - config_name: default data_files: - split: train path: 0000_pair.parquet dataset_info: features: - name: task dtype: string - name: dataset dtype: string - name: trajectory list: - name: from dtype: string - name: value dtype: string - name: chosen dtype: string - name: rejected dtype: string - name: id dtype: string - name: parent_id dtype: string splits: - name: train num_bytes: 1144517 num_examples: 219522 download_size: 1144517 dataset_size: 1144517 --- ## Introduction - 📜 [Paper](https://arxiv.org/abs/2404.02078) - 🤗 [Eurus Collection](https://huggingface.co/collections/openbmb/eurus-660bc40bec5376b3adc9d1c5) - 🤗 UltraInteract - [SFT](https://huggingface.co/datasets/openbmb/UltraInteract_sft) - [Preference Learning](https://huggingface.co/datasets/openbmb/UltraInteract_pair) - [GitHub Repo](https://github.com/OpenBMB/Eurus) UltraInteract is a large-scale, high-quality alignment dataset specifically designed for complex reasoning tasks. For each instruction, it includes a preference tree consisting of - (1) reasoning chains with diverse planning strategies in a unified format - (2) multi-turn interaction trajectories with the environment and the critique - (3) pairwise data to facilitate preference learning ## Structure UltraInteract collects a preference tree for each instruction, with the instruction being the root and each action a node. A trajectory is a root-to-leaf path consisting of a sequence of actions. In each preference tree, all nodes of correct actions and all trajectories ending with correct actions can be used for SFT. Paired correct and incorrect nodes or trajectories can be used for preference learning. <img src="./figures/tree.png" alt="tree" style="zoom: 20%;" /> ## Illustrative Example Here is an illustrative example of an UltraInteract trajectory over two turns. 
In each turn, the actor model generates step-by-step reasoning chains, and the environment and the critique model provide observations and textual critique respectively. <img src="./figures/ui_example.png" alt="ui_example" style="zoom: 25%;" /> ## Stats Below are some statistics about UltraInteract. It consists of 86k instructions, 286k correct answers, and 219k pairs. <img src="./figures/stats.png" alt="stats" style="zoom: 40%;" /> ## Dataset Format (Preference Learning) ```JSONC { "id": "d1f094e0e13096bc1bb6260e1e0d807e2c4ebfbbf4a4373f04cbed001c385c3c_5_3", // unique data id, contains three parts, the unique string, the depth of the preference tree, and current turn, separated by '_' "parent_id": "d1f094e0e13096bc1bb6260e1e0d807e2c4ebfbbf4a4373f04cbed001c385c3c", // data with the same parent_id have the same instruction but different actions. "task": "Math_PoT", // task type, choices = ["Coding", "Math_Cot", "Math_PoT", "Logic"] "dataset": "MATH", // the dataset where the instruction comes from "trajectory": [ { "from": "user", "value": "Tool available:\n[1] Python interpreter\nWhen you send a message containing Python code to python, it will be executed in a stateful Jupyter notebook environment.\nSolve the following math problem step-by-step.\nSimplify your answer as much as possible.\nSimplify $(-3-2i)-(1+4i)$." 
// the task instruction
  },
  {
    "from": "assistant",
    "value": "Step 1: To simplify the expression $(-3-2i)-(1+4i)$, we need to distribute the negative sign to both terms inside the parentheses.\nStep 2: We can then combine like terms by adding the real parts and the imaginary parts separately.\n```python\n# Step 3: Distribute the negative sign to both terms inside the parentheses.\n(-3) - (-1) + (-2i) - (-4i)\n# Step 4: Combine like terms by adding the real parts and the imaginary parts separately.\n-3 + 1 + (-2i + 4i)\n```" // the assistant's action (turn 1)
  },
  {
    "from": "user",
    "value": "Out: Cell In[1], line 2\n (-3) - (-1) + (-2i) - (-4i)\n ^\nSyntaxError: invalid syntax\nYour answer is wrong.\nYour thought process for simplifying the given complex numbers is correct. However, you made a syntax error in the code. In Python, the multiplication operation must be explicitly written out with an asterisk (*), even for complex numbers. Remember, the imaginary unit in Python is denoted as j and not i. Try correcting the syntax and calculate again." // the observation from python interpreter and feedback (turn 1)
  },
  {
    "from": "assistant",
    "value": "Step 1: To simplify the expression $(-3-2i)-(1+4i)$, we need to distribute the negative sign to both terms inside the parentheses.\nStep 2: We can then combine like terms by adding the real parts and the imaginary parts separately.\n```python\n# Step 3: Distribute the negative sign to both terms inside the parentheses.\n(-3) - (-1) + (-2j) - (-4j)\n# Step 4: Combine like terms by adding the real parts and the imaginary parts separately.\n-3 + 1 + (-2j + 4j)\n```" // the assistant's action according to the observation and feedback. (turn 2)
  },
  {
    "from": "user",
    "value": "Out: (-2+2j)\nYour answer is wrong.\nYour thought process for simplifying the given complex numbers is on the right track. However, you made an arithmetic error when adding the real parts (-3 + 1 ≠ -2) and the imaginary parts (-2j + 4j ≠ 2j). 
You should have combined the like terms correctly to get the right answer -4 and -6j respectively. Please try again with the correct arithmetic calculations." // the observation from python interpreter and feedback. (turn 2) } ] // trajectory is the interaction between user and assistant, and the first element is the task instruction. "chosen": "Step 1: Split the expression into real and imaginary terms.\nStep 2: Subtract the real terms from each other.\nStep 3: Subtract the imaginary terms from each other.\nStep 4: Combine the results of the subtraction to get the simplified form of the expression.\n\n\n\n```python\n# Step 5: Split the expression into real and imaginary parts and then subtract them.\nreal_part = -3 - 1 # Subtract the real parts\nimaginary_part = -2 - 4 # Subtract the imaginary parts\n# Step 6: Combine the real and imaginary parts to get the result.\nresult = f\"{real_part}{imaginary_part:+d}i\"\nprint(result)\n```", // the chosen answer based on the trajectory. "rejected": "Step 1: To simplify the expression $(-3-2i)-(1+4i)$, we need to distribute the negative sign to both terms inside the parentheses.\nStep 2: We can then combine like terms by adding the real parts and the imaginary parts separately.\n```python\n# Step 3: Distribute the negative sign to both terms inside the parentheses.\n(-3) - 1 - (-2j) - 4j\n# Step 4: Combine like terms by adding the real parts and the imaginary parts separately.\n-3 - 1 - 2j - 4j\n```", // the rejected answer based on the trajectory. } ``` ## Citation ```bib @misc{yuan2024advancing, title={Advancing LLM Reasoning Generalists with Preference Trees}, author={Lifan Yuan and Ganqu Cui and Hanbin Wang and Ning Ding and Xingyao Wang and Jia Deng and Boji Shan and Huimin Chen and Ruobing Xie and Yankai Lin and Zhenghao Liu and Bowen Zhou and Hao Peng and Zhiyuan Liu and Maosong Sun}, year={2024}, eprint={2404.02078}, archivePrefix={arXiv}, primaryClass={cs.AI} } ```
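To make the pairwise format documented above concrete, here is a minimal sketch of turning one record into a `(prompt, chosen, rejected)` triple for DPO-style preference learning. The `to_preference_triple` helper and its role-tag rendering scheme are our own assumptions, not part of the official release; the field names (`trajectory`, `chosen`, `rejected`) come from the format section:

```python
def to_preference_triple(record):
    """Flatten an UltraInteract_pair record into (prompt, chosen, rejected).

    The prompt renders the multi-turn trajectory as role-tagged text;
    this rendering is an assumption, not the official scheme.
    """
    prompt = "\n".join(
        f"[{turn['from']}]\n{turn['value']}" for turn in record["trajectory"]
    )
    return prompt, record["chosen"], record["rejected"]

# Tiny stand-in record following the schema documented above.
record = {
    "trajectory": [
        {"from": "user", "value": "Simplify $(-3-2i)-(1+4i)$."},
        {"from": "assistant", "value": "Step 1: ..."},
        {"from": "user", "value": "Your answer is wrong. Try again."},
    ],
    "chosen": "Step 1: Subtract real and imaginary parts separately ...",
    "rejected": "Step 1: Distribute the negative sign incorrectly ...",
}

prompt, chosen, rejected = to_preference_triple(record)
print(prompt.startswith("[user]"))  # → True
```

The same flattening can be applied to real records after `load_dataset("openbmb/UltraInteract_pair", split="train")`.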
krisha05/story-generation-dataset
--- dataset_info: features: - name: instruction dtype: string - name: output dtype: string splits: - name: train num_bytes: 6987099 num_examples: 4000 download_size: 4094893 dataset_size: 6987099 configs: - config_name: default data_files: - split: train path: data/train-* ---
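A brief sketch of how the `instruction`/`output` columns above could be rendered into training text. The template and the sample row are assumptions for illustration; real rows come from `load_dataset("krisha05/story-generation-dataset")`:

```python
def render_example(row, template="### Instruction:\n{instruction}\n\n### Response:\n{output}"):
    """Format one instruction/output row with a simple SFT-style template."""
    return template.format(**row)

# Hypothetical row with the column names listed in the metadata above.
row = {
    "instruction": "Write a short story about a lighthouse.",
    "output": "The lamp turned all night while the sea argued with the rocks.",
}
text = render_example(row)
print(text.splitlines()[0])  # → ### Instruction:
```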
open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v9
---
pretty_name: Evaluation run of andysalerno/rainbowfish-7B-v9
dataset_summary: "Dataset automatically created during the evaluation run of model\
  \ [andysalerno/rainbowfish-7B-v9](https://huggingface.co/andysalerno/rainbowfish-7B-v9)\
  \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
  \ be found as a specific split in each configuration, the split being named using\
  \ the timestamp of the run. The \"train\" split is always pointing to the latest\
  \ results.\n\nAn additional configuration \"results\" stores all the aggregated\
  \ results of the run (and is used to compute and display the aggregated metrics\
  \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
  \nTo load the details from a run, you can for instance do the following:\n```python\n\
  from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v9\"\
  ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2024-02-12T20:14:46.042064](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v9/blob/main/results_2024-02-12T20-14-46.042064.json)\
  \ (note that there might be results for other tasks in the repos if successive evals\
  \ didn't cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6270678144594366,\n\ \ \"acc_stderr\": 0.03256950382813242,\n \"acc_norm\": 0.6331149722051534,\n\ \ \"acc_norm_stderr\": 0.033232016922454914,\n \"mc1\": 0.32558139534883723,\n\ \ \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.488208971713914,\n\ \ \"mc2_stderr\": 0.01514417520712263\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5827645051194539,\n \"acc_stderr\": 0.01440982551840308,\n\ \ \"acc_norm\": 0.6177474402730375,\n \"acc_norm_stderr\": 0.014200454049979277\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6320454092810197,\n\ \ \"acc_stderr\": 0.004812633280078265,\n \"acc_norm\": 0.8243377813184625,\n\ \ \"acc_norm_stderr\": 0.003797548252851631\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\ \ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\ \ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n\ \ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\ \ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \ \ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\ \ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\ \ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\ \ \"acc_norm_stderr\": 0.03653946969442099\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \ \ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n\ \ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\ \ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\ \ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\ \ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\ \ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\ \ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\ \ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n\ \ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\ \ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.3915343915343915,\n \"acc_stderr\": 0.02513809138885111,\n \"\ acc_norm\": 0.3915343915343915,\n 
\"acc_norm_stderr\": 0.02513809138885111\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\ \ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\ \ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \ \ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.7580645161290323,\n \"acc_stderr\": 0.024362599693031086,\n \"\ acc_norm\": 0.7580645161290323,\n \"acc_norm_stderr\": 0.024362599693031086\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\ acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\ : 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\ \ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\ acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306443,\n\ \ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306443\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478926,\n\ \ \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478926\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \ \ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\ \ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\ acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203627,\n \"\ acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203627\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.47685185185185186,\n \"acc_stderr\": 0.034063153607115065,\n \"\ acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.034063153607115065\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8088235294117647,\n \"acc_stderr\": 0.02759917430064076,\n \"\ acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.02759917430064076\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \ \ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\ \ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\ \ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\ \ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163046,\n \"\ acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163046\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\ \ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\ \ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\ \ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\ \ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\ \ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\ \ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\ \ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\ \ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8033205619412516,\n\ \ \"acc_stderr\": 0.014214138556913915,\n \"acc_norm\": 0.8033205619412516,\n\ \ \"acc_norm_stderr\": 0.014214138556913915\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n\ \ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3094972067039106,\n\ \ \"acc_stderr\": 0.015461169002371537,\n \"acc_norm\": 
0.3094972067039106,\n\ \ \"acc_norm_stderr\": 0.015461169002371537\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\ \ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\ \ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\ \ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495036,\n\ \ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495036\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \ \ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4491525423728814,\n\ \ \"acc_stderr\": 0.012704030518851488,\n \"acc_norm\": 0.4491525423728814,\n\ \ \"acc_norm_stderr\": 0.012704030518851488\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.02850145286039656,\n\ \ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.02850145286039656\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6535947712418301,\n \"acc_stderr\": 0.019249785691717206,\n \ \ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.019249785691717206\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\ \ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\ \ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n\ \ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\ \ \"acc_stderr\": 0.026508590656233278,\n \"acc_norm\": 0.8308457711442786,\n\ \ \"acc_norm_stderr\": 0.026508590656233278\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \ \ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\ \ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\ \ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\ \ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32558139534883723,\n\ \ \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.488208971713914,\n\ \ \"mc2_stderr\": 0.01514417520712263\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205201\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3479909021986353,\n \ \ \"acc_stderr\": 0.013120581030382132\n }\n}\n```" repo_url: https://huggingface.co/andysalerno/rainbowfish-7B-v9 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|arc:challenge|25_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-02-12T20-14-46.042064.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|gsm8k|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_02_12T20_14_46.042064 path: - 
'**/details_harness|hellaswag|10_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T20-14-46.042064.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-12T20-14-46.042064.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T20-14-46.042064.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T20-14-46.042064.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T20-14-46.042064.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-12T20-14-46.042064.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T20-14-46.042064.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-management|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-virology|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T20-14-46.042064.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|truthfulqa:mc|0_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-02-12T20-14-46.042064.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_02_12T20_14_46.042064 path: - '**/details_harness|winogrande|5_2024-02-12T20-14-46.042064.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-02-12T20-14-46.042064.parquet' - config_name: results data_files: - split: 
    2024_02_12T20_14_46.042064
    path:
    - results_2024-02-12T20-14-46.042064.parquet
  - split: latest
    path:
    - results_2024-02-12T20-14-46.042064.parquet
---

# Dataset Card for Evaluation run of andysalerno/rainbowfish-7B-v9

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [andysalerno/rainbowfish-7B-v9](https://huggingface.co/andysalerno/rainbowfish-7B-v9) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v9",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2024-02-12T20:14:46.042064](https://huggingface.co/datasets/open-llm-leaderboard/details_andysalerno__rainbowfish-7B-v9/blob/main/results_2024-02-12T20-14-46.042064.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.6270678144594366,
        "acc_stderr": 0.03256950382813242,
        "acc_norm": 0.6331149722051534,
        "acc_norm_stderr": 0.033232016922454914,
        "mc1": 0.32558139534883723,
        "mc1_stderr": 0.016403989469907825,
        "mc2": 0.488208971713914,
        "mc2_stderr": 0.01514417520712263
    },
    "harness|arc:challenge|25": {
        "acc": 0.5827645051194539,
        "acc_stderr": 0.01440982551840308,
        "acc_norm": 0.6177474402730375,
        "acc_norm_stderr": 0.014200454049979277
    },
    "harness|hellaswag|10": {
        "acc": 0.6320454092810197,
        "acc_stderr": 0.004812633280078265,
        "acc_norm": 0.8243377813184625,
        "acc_norm_stderr": 0.003797548252851631
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.29,
        "acc_stderr": 0.045604802157206845,
        "acc_norm": 0.29,
        "acc_norm_stderr": 0.045604802157206845
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.6222222222222222,
        "acc_stderr": 0.04188307537595852,
        "acc_norm": 0.6222222222222222,
        "acc_norm_stderr": 0.04188307537595852
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.6447368421052632,
        "acc_stderr": 0.03894734487013317,
        "acc_norm": 0.6447368421052632,
        "acc_norm_stderr": 0.03894734487013317
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.57,
        "acc_stderr": 0.049756985195624284,
        "acc_norm": 0.57,
        "acc_norm_stderr": 0.049756985195624284
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.690566037735849,
        "acc_stderr": 0.028450154794118637,
        "acc_norm": 0.690566037735849,
        "acc_norm_stderr": 0.028450154794118637
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.7430555555555556,
        "acc_stderr": 0.03653946969442099,
        "acc_norm": 0.7430555555555556,
        "acc_norm_stderr": 0.03653946969442099
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.5,
        "acc_stderr": 0.050251890762960605,
        "acc_norm": 0.5,
        "acc_norm_stderr": 0.050251890762960605
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.52,
        "acc_stderr": 0.05021167315686779,
        "acc_norm": 0.52,
        "acc_norm_stderr": 0.05021167315686779
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.38,
        "acc_stderr": 0.04878317312145632,
        "acc_norm": 0.38,
        "acc_norm_stderr": 0.04878317312145632
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.6184971098265896,
        "acc_stderr": 0.03703851193099521,
        "acc_norm": 0.6184971098265896,
        "acc_norm_stderr": 0.03703851193099521
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.37254901960784315,
        "acc_stderr": 0.04810840148082636,
        "acc_norm": 0.37254901960784315,
        "acc_norm_stderr": 0.04810840148082636
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.79,
        "acc_stderr": 0.04093601807403326,
        "acc_norm": 0.79,
        "acc_norm_stderr": 0.04093601807403326
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.5531914893617021,
        "acc_stderr": 0.0325005368436584,
        "acc_norm": 0.5531914893617021,
        "acc_norm_stderr": 0.0325005368436584
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.49122807017543857,
        "acc_stderr": 0.047028804320496165,
        "acc_norm": 0.49122807017543857,
        "acc_norm_stderr": 0.047028804320496165
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.5379310344827586,
        "acc_stderr": 0.04154659671707548,
        "acc_norm": 0.5379310344827586,
        "acc_norm_stderr": 0.04154659671707548
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.3915343915343915,
        "acc_stderr": 0.02513809138885111,
        "acc_norm": 0.3915343915343915,
        "acc_norm_stderr": 0.02513809138885111
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.4444444444444444,
        "acc_stderr": 0.044444444444444495,
        "acc_norm": 0.4444444444444444,
        "acc_norm_stderr": 0.044444444444444495
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.41,
        "acc_stderr": 0.049431107042371025,
        "acc_norm": 0.41,
        "acc_norm_stderr": 0.049431107042371025
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.7580645161290323,
        "acc_stderr": 0.024362599693031086,
        "acc_norm": 0.7580645161290323,
        "acc_norm_stderr": 0.024362599693031086
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.5024630541871922,
        "acc_stderr": 0.035179450386910616,
        "acc_norm": 0.5024630541871922,
        "acc_norm_stderr": 0.035179450386910616
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.66,
        "acc_stderr": 0.04760952285695237,
        "acc_norm": 0.66,
        "acc_norm_stderr": 0.04760952285695237
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.7878787878787878,
        "acc_stderr": 0.03192271569548301,
        "acc_norm": 0.7878787878787878,
        "acc_norm_stderr": 0.03192271569548301
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.7676767676767676,
        "acc_stderr": 0.030088629490217487,
        "acc_norm": 0.7676767676767676,
        "acc_norm_stderr": 0.030088629490217487
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.8549222797927462,
        "acc_stderr": 0.025416343096306443,
        "acc_norm": 0.8549222797927462,
        "acc_norm_stderr": 0.025416343096306443
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.6307692307692307,
        "acc_stderr": 0.024468615241478926,
        "acc_norm": 0.6307692307692307,
        "acc_norm_stderr": 0.024468615241478926
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.3333333333333333,
        "acc_stderr": 0.028742040903948492,
        "acc_norm": 0.3333333333333333,
        "acc_norm_stderr": 0.028742040903948492
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.6428571428571429,
        "acc_stderr": 0.031124619309328177,
        "acc_norm": 0.6428571428571429,
        "acc_norm_stderr": 0.031124619309328177
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.33774834437086093,
        "acc_stderr": 0.03861557546255169,
        "acc_norm": 0.33774834437086093,
        "acc_norm_stderr": 0.03861557546255169
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.8110091743119267,
        "acc_stderr": 0.016785481159203627,
        "acc_norm": 0.8110091743119267,
        "acc_norm_stderr": 0.016785481159203627
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.47685185185185186,
        "acc_stderr": 0.034063153607115065,
        "acc_norm": 0.47685185185185186,
        "acc_norm_stderr": 0.034063153607115065
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.8088235294117647,
        "acc_stderr": 0.02759917430064076,
        "acc_norm": 0.8088235294117647,
        "acc_norm_stderr": 0.02759917430064076
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.7510548523206751,
        "acc_stderr": 0.028146970599422644,
        "acc_norm": 0.7510548523206751,
        "acc_norm_stderr": 0.028146970599422644
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.672645739910314,
        "acc_stderr": 0.03149384670994131,
        "acc_norm": 0.672645739910314,
        "acc_norm_stderr": 0.03149384670994131
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.7862595419847328,
        "acc_stderr": 0.0359546161177469,
        "acc_norm": 0.7862595419847328,
        "acc_norm_stderr": 0.0359546161177469
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.7520661157024794,
        "acc_stderr": 0.039418975265163046,
        "acc_norm": 0.7520661157024794,
        "acc_norm_stderr": 0.039418975265163046
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.7685185185185185,
        "acc_stderr": 0.04077494709252627,
        "acc_norm": 0.7685185185185185,
        "acc_norm_stderr": 0.04077494709252627
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.7484662576687117,
        "acc_stderr": 0.03408997886857529,
        "acc_norm": 0.7484662576687117,
        "acc_norm_stderr": 0.03408997886857529
    },
    "harness|hendrycksTest-machine_learning|5": {
        "acc": 0.44642857142857145,
        "acc_stderr": 0.04718471485219588,
        "acc_norm": 0.44642857142857145,
        "acc_norm_stderr": 0.04718471485219588
    },
    "harness|hendrycksTest-management|5": {
        "acc": 0.7864077669902912,
        "acc_stderr": 0.04058042015646034,
        "acc_norm": 0.7864077669902912,
        "acc_norm_stderr": 0.04058042015646034
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.8589743589743589,
        "acc_stderr": 0.02280138253459753,
        "acc_norm": 0.8589743589743589,
        "acc_norm_stderr": 0.02280138253459753
    },
    "harness|hendrycksTest-medical_genetics|5": {
        "acc": 0.69,
        "acc_stderr": 0.04648231987117316,
        "acc_norm": 0.69,
        "acc_norm_stderr": 0.04648231987117316
    },
    "harness|hendrycksTest-miscellaneous|5": {
        "acc": 0.8033205619412516,
        "acc_stderr": 0.014214138556913915,
        "acc_norm": 0.8033205619412516,
        "acc_norm_stderr": 0.014214138556913915
    },
    "harness|hendrycksTest-moral_disputes|5": {
        "acc": 0.7196531791907514,
        "acc_stderr": 0.02418242749657761,
        "acc_norm": 0.7196531791907514,
        "acc_norm_stderr": 0.02418242749657761
    },
    "harness|hendrycksTest-moral_scenarios|5": {
        "acc": 0.3094972067039106,
        "acc_stderr": 0.015461169002371537,
        "acc_norm": 0.3094972067039106,
        "acc_norm_stderr": 0.015461169002371537
    },
    "harness|hendrycksTest-nutrition|5": {
        "acc": 0.738562091503268,
        "acc_stderr": 0.025160998214292456,
        "acc_norm": 0.738562091503268,
        "acc_norm_stderr": 0.025160998214292456
    },
    "harness|hendrycksTest-philosophy|5": {
        "acc": 0.707395498392283,
        "acc_stderr": 0.02583989833487798,
        "acc_norm": 0.707395498392283,
        "acc_norm_stderr": 0.02583989833487798
    },
    "harness|hendrycksTest-prehistory|5": {
        "acc": 0.7098765432098766,
        "acc_stderr": 0.025251173936495036,
        "acc_norm": 0.7098765432098766,
        "acc_norm_stderr": 0.025251173936495036
    },
    "harness|hendrycksTest-professional_accounting|5": {
        "acc": 0.48936170212765956,
        "acc_stderr": 0.029820747191422473,
        "acc_norm": 0.48936170212765956,
        "acc_norm_stderr": 0.029820747191422473
    },
    "harness|hendrycksTest-professional_law|5": {
        "acc": 0.4491525423728814,
        "acc_stderr": 0.012704030518851488,
        "acc_norm": 0.4491525423728814,
        "acc_norm_stderr": 0.012704030518851488
    },
    "harness|hendrycksTest-professional_medicine|5": {
        "acc": 0.6727941176470589,
        "acc_stderr": 0.02850145286039656,
        "acc_norm": 0.6727941176470589,
        "acc_norm_stderr": 0.02850145286039656
    },
    "harness|hendrycksTest-professional_psychology|5": {
        "acc": 0.6535947712418301,
        "acc_stderr": 0.019249785691717206,
        "acc_norm": 0.6535947712418301,
        "acc_norm_stderr": 0.019249785691717206
    },
    "harness|hendrycksTest-public_relations|5": {
        "acc": 0.6727272727272727,
        "acc_stderr": 0.0449429086625209,
        "acc_norm": 0.6727272727272727,
        "acc_norm_stderr": 0.0449429086625209
    },
    "harness|hendrycksTest-security_studies|5": {
        "acc": 0.726530612244898,
        "acc_stderr": 0.028535560337128445,
        "acc_norm": 0.726530612244898,
        "acc_norm_stderr": 0.028535560337128445
    },
    "harness|hendrycksTest-sociology|5": {
        "acc": 0.8308457711442786,
        "acc_stderr": 0.026508590656233278,
        "acc_norm": 0.8308457711442786,
        "acc_norm_stderr": 0.026508590656233278
    },
    "harness|hendrycksTest-us_foreign_policy|5": {
        "acc": 0.87,
        "acc_stderr": 0.03379976689896308,
        "acc_norm": 0.87,
        "acc_norm_stderr": 0.03379976689896308
    },
    "harness|hendrycksTest-virology|5": {
        "acc": 0.5481927710843374,
        "acc_stderr": 0.03874371556587953,
        "acc_norm": 0.5481927710843374,
        "acc_norm_stderr": 0.03874371556587953
    },
    "harness|hendrycksTest-world_religions|5": {
        "acc": 0.8070175438596491,
        "acc_stderr": 0.030267457554898458,
        "acc_norm": 0.8070175438596491,
        "acc_norm_stderr": 0.030267457554898458
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.32558139534883723,
        "mc1_stderr": 0.016403989469907825,
        "mc2": 0.488208971713914,
        "mc2_stderr": 0.01514417520712263
    },
    "harness|winogrande|5": {
        "acc": 0.77663772691397,
        "acc_stderr": 0.011705697565205201
    },
    "harness|gsm8k|5": {
        "acc": 0.3479909021986353,
        "acc_stderr": 0.013120581030382132
    }
}
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used.
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
ericflo/unnaturalhermes-reflections-30k
--- license: apache-2.0 ---
CyberHarem/manchester_azurlane
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of manchester/マンチェスター/曼彻斯特 (Azur Lane) This is the dataset of manchester/マンチェスター/曼彻斯特 (Azur Lane), containing 29 images and their tags. The core tags of this character are `breasts, bangs, large_breasts, grey_hair, green_eyes, hair_bun, maid_headdress, short_hair, hat, nurse_cap, symbol-shaped_pupils`, which are pruned in this dataset. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)). ## List of Packages | Name | Images | Size | Download | Type | Description | |:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------| | raw | 29 | 58.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/manchester_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). | | 800 | 29 | 25.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/manchester_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. | | stage3-p480-800 | 74 | 60.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/manchester_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | | 1200 | 29 | 47.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/manchester_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. 
| | stage3-p480-1200 | 74 | 98.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/manchester_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. | ### Load Raw Dataset with Waifuc We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code ```python import os import zipfile from huggingface_hub import hf_hub_download from waifuc.source import LocalSource # download raw archive file zip_file = hf_hub_download( repo_id='CyberHarem/manchester_azurlane', repo_type='dataset', filename='dataset-raw.zip', ) # extract files to your directory dataset_dir = 'dataset_dir' os.makedirs(dataset_dir, exist_ok=True) with zipfile.ZipFile(zip_file, 'r') as zf: zf.extractall(dataset_dir) # load the dataset with waifuc source = LocalSource(dataset_dir) for item in source: print(item.image, item.meta['filename'], item.meta['tags']) ``` ## List of Clusters List of tag clustering result, maybe some outfits can be mined here. 
### Raw Text Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, navel, solo, black_bikini, blush, smile, collarbone, cleavage, open_mouth, twintails, aqua_eyes, maid_bikini, bare_shoulders, black_choker, frilled_bikini, outdoors, sitting, twin_braids, wrist_cuffs, bridal_garter, closed_mouth, nipples, side-tie_bikini_bottom, x_hair_ornament | | 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, looking_at_viewer, solo, white_gloves, blush, nurse, short_sleeves, shrug_(clothing), white_thighhighs, bra, demon_wings, holding_syringe, demon_tail, heart-shaped_pupils, navel, open_mouth, simple_background, sitting, smile, white_skirt, garter_straps, medium_breasts, single_hair_bun, white_background | ### Table Version | # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | navel | solo | black_bikini | blush | smile | collarbone | cleavage | open_mouth | twintails | aqua_eyes | maid_bikini | bare_shoulders | black_choker | frilled_bikini | outdoors | sitting | twin_braids | wrist_cuffs | bridal_garter | closed_mouth | nipples | side-tie_bikini_bottom | x_hair_ornament | white_gloves | nurse | 
short_sleeves | shrug_(clothing) | white_thighhighs | bra | demon_wings | holding_syringe | demon_tail | heart-shaped_pupils | simple_background | white_skirt | garter_straps | medium_breasts | single_hair_bun | white_background | |----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:---------------|:--------|:--------|:-------------|:-----------|:-------------|:------------|:------------|:--------------|:-----------------|:---------------|:-----------------|:-----------|:----------|:--------------|:--------------|:----------------|:---------------|:----------|:-------------------------|:------------------|:---------------|:--------|:----------------|:-------------------|:-------------------|:------|:--------------|:------------------|:-------------|:----------------------|:--------------------|:--------------|:----------------|:-----------------|:------------------|:-------------------| | 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | | X | X | | | X | | | | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
reginaboateng/pubmed_long_tokenised
--- dataset_info: features: - name: input_ids sequence: int32 - name: token_type_ids sequence: int8 - name: attention_mask sequence: int8 - name: labels sequence: int64 splits: - name: train num_bytes: 493127488 num_examples: 119924 - name: validation num_bytes: 27274896 num_examples: 6633 - name: test num_bytes: 27377696 num_examples: 6658 download_size: 153946164 dataset_size: 547780080 --- # Dataset Card for "pubmed_long_tokenised" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
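The frontmatter above declares only pre-tokenised features. As a minimal, hypothetical sketch of how one such row is typically consumed — the four field names come from the card, but the token values below are made up for illustration:

```python
# A made-up row shaped like the card's frontmatter: four tokenised fields.
row = {
    "input_ids":      [101, 7592, 2088, 102, 0, 0],          # token ids, zero-padded
    "token_type_ids": [0, 0, 0, 0, 0, 0],                    # single-segment input
    "attention_mask": [1, 1, 1, 1, 0, 0],                    # 1 = real token, 0 = padding
    "labels":         [-100, 7592, 2088, -100, -100, -100],  # -100 is conventionally ignored by the loss
}

def unpadded_length(attention_mask):
    """Count the real (non-padding) tokens, i.e. positions where the mask is 1."""
    return sum(attention_mask)

print(unpadded_length(row["attention_mask"]))  # -> 4
```

The actual rows in this dataset may use different padding and label conventions; this only illustrates the declared schema.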
sshreyy/global_classify_v2_2xtrain
--- dataset_info: features: - name: text dtype: string - name: labels dtype: int64 splits: - name: train num_bytes: 8135110 num_examples: 3890 - name: test num_bytes: 858889 num_examples: 405 download_size: 2239783 dataset_size: 8993999 --- # Dataset Card for "global_classify_v2_2xtrain" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
allennghayoui/mistral-code-assistant-finetune
--- dataset_info: features: - name: prompt dtype: string splits: - name: train num_bytes: 184976.14583333334 num_examples: 172 - name: validation num_bytes: 10754.427083333334 num_examples: 10 - name: test num_bytes: 23839 num_examples: 30 download_size: 77625 dataset_size: 219569.5729166667 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* ---
notzero/oasstdt_step2
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 7099035 num_examples: 9845 - name: validation num_bytes: 58976 num_examples: 96 download_size: 3429118 dataset_size: 7158011 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* ---
whatisslove11/320_ms_eval
--- dataset_info: features: - name: audio dtype: audio: sampling_rate: 16000 - name: label dtype: class_label: names: '0': normal_speech '1': whisper '2': music '3': scream splits: - name: train num_bytes: 178892132.33 num_examples: 3166 download_size: 164485752 dataset_size: 178892132.33 configs: - config_name: default data_files: - split: train path: data/train-* ---
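The `class_label` block above fixes the integer encoding of the four audio classes. A minimal sketch of decoding a label id back to its name — the mapping is copied from the frontmatter; the helper itself is illustrative:

```python
# Integer -> name mapping, copied from the `class_label` block in the frontmatter above.
LABEL_NAMES = ["normal_speech", "whisper", "music", "scream"]

def label_name(label_id: int) -> str:
    """Decode an integer label from this dataset into its class name."""
    return LABEL_NAMES[label_id]

print(label_name(1))  # -> whisper
```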
hippocrates/guideline_prediction_train
--- dataset_info: features: - name: id dtype: string - name: conversations list: - name: from dtype: string - name: value dtype: string - name: text dtype: string splits: - name: train num_bytes: 272615754 num_examples: 28014 download_size: 142120670 dataset_size: 272615754 configs: - config_name: default data_files: - split: train path: data/train-* ---
arielnlee/Realistic-Occlusion-Dataset
--- license: other task_categories: - image-classification language: - en tags: - occlusion size_categories: - 1K<n<10K dataset_info: features: - name: image dtype: image - name: label dtype: class_label: names: '0': banana '1': baseball '2': cowboy hat '3': cup '4': dumbbell '5': hammer '6': laptop '7': microwave '8': mouse '9': orange '10': pillow '11': plate '12': screwdriver '13': skillet '14': spatula '15': vase splits: - name: ROD num_bytes: 3306212413 num_examples: 1231 download_size: 3285137456 dataset_size: 3306212413 --- # Real Occlusion Dataset (ROD) The Realistic Occlusion Dataset is the product of a meticulous object collection protocol aimed at collecting and capturing 40+ distinct objects from 16 classes: <strong>banana, baseball, cowboy hat, cup, dumbbell, hammer, laptop, microwave, mouse, orange, pillow, plate, screwdriver, skillet, spatula, and vase.</strong> Images are taken in a bright room with soft, natural light. All objects are captured on a brown wooden table against a solid colored wall. An iPhone 13 Pro ultra-wide camera with a tripod is used to capture images at an elevation of approx. 90 degrees and distance of 1 meter from the object. Occluder objects are wooden blocks or square pieces of cardboard, painted red or blue. The occluder object is added between the camera and the main object and its x-axis position is varied such that it begins at the left of the frame and ends at the right. In total, 1 clean image and 12 occluded images are captured for each object. Each object is measured and the occluder step size is broken up into equal sizes. ROD was created for testing model robustness to occlusion in [Hardwiring ViT Patch Selectivity into CNNs using Patch Mixing](https://arielnlee.github.io/PatchMixing/). ## Citations ```bibtex @misc{lee2023hardwiring, title={Hardwiring ViT Patch Selectivity into CNNs using Patch Mixing}, author={Ariel N. 
Lee and Sarah Adel Bargal and Janavi Kasera and Stan Sclaroff and Kate Saenko and Nataniel Ruiz}, year={2023}, eprint={2306.17848}, archivePrefix={arXiv}, primaryClass={cs.CV} } ```
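The capture protocol above varies the occluder's x-position in equal steps from the left of the frame to the right, producing 12 occluded images per object. A rough sketch of that equal-step placement, assuming a normalized frame width — the card does not give the exact measurements, so the numbers here are purely illustrative:

```python
def occluder_positions(frame_width: float, n_occluded: int = 12) -> list[float]:
    """Equally spaced occluder x-positions spanning the frame, from the left
    edge (0) to the right edge (frame_width), one per occluded capture."""
    return [frame_width * i / (n_occluded - 1) for i in range(n_occluded)]

# With a normalized frame width of 1.0, the 12 captures step from 0.0 to 1.0.
positions = occluder_positions(frame_width=1.0)
```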
Shouq95/turkeytreedata
--- tags: - trees ---
autoevaluate/autoeval-eval-inverse-scaling__hindsight-neglect-10shot-inverse-scali-383fe9-1695459610
--- type: predictions tags: - autotrain - evaluation datasets: - inverse-scaling/hindsight-neglect-10shot eval_info: task: text_zero_shot_classification model: inverse-scaling/opt-6.7b_eval metrics: [] dataset_name: inverse-scaling/hindsight-neglect-10shot dataset_config: inverse-scaling--hindsight-neglect-10shot dataset_split: train col_mapping: text: prompt classes: classes target: answer_index --- # Dataset Card for AutoTrain Evaluator This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset: * Task: Zero-Shot Text Classification * Model: inverse-scaling/opt-6.7b_eval * Dataset: inverse-scaling/hindsight-neglect-10shot * Config: inverse-scaling--hindsight-neglect-10shot * Split: train To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator). ## Contributions Thanks to [@MicPie](https://huggingface.co/MicPie) for evaluating this model.
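The `col_mapping` block in the frontmatter above tells the evaluator how to rename the source dataset's columns into the fields the zero-shot classification task expects. A minimal sketch of applying such a mapping to one record — the mapping is taken from the card, while the record itself is made up for illustration:

```python
# destination field -> source column, as declared in the card's `col_mapping`.
COL_MAPPING = {"text": "prompt", "classes": "classes", "target": "answer_index"}

# A made-up record shaped like one row of the source dataset.
record = {
    "prompt": "Example few-shot prompt text ...",
    "classes": [" yes", " no"],
    "answer_index": 1,
}

# Rename the columns into the task's expected fields.
mapped = {dst: record[src] for dst, src in COL_MAPPING.items()}
print(mapped["target"])  # -> 1
```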
multi-train/agnews_1107
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: query dtype: string - name: pos sequence: string - name: neg sequence: string - name: task dtype: string - name: instruction struct: - name: query dtype: string - name: pos dtype: string - name: neg dtype: string splits: - name: train num_bytes: 98773974 num_examples: 200000 download_size: 50174968 dataset_size: 98773974 --- # Dataset Card for "agnews_1107" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
MichiganNLP/svo_probes
--- license: cc-by-4.0 language: - en pretty_name: SVO-Probes size_categories: - 10K<n<100K --- # SVO-Probes This dataset comes from https://github.com/deepmind/svo_probes. ## Usage ```python from datasets import load_dataset # Note that the following line says "train" split, but there are actually no splits in this dataset. dataset = load_dataset("MichiganNLP/svo_probes", split="train") # To see an example, access the first element of the dataset with `dataset[0]`. ```
open-llm-leaderboard/details_nbeerbower__flammen3X
--- pretty_name: Evaluation run of nbeerbower/flammen3X dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [nbeerbower/flammen3X](https://huggingface.co/nbeerbower/flammen3X) on the [Open\ \ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nbeerbower__flammen3X\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-09T18:21:14.905349](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__flammen3X/blob/main/results_2024-03-09T18-21-14.905349.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6511768014924072,\n\ \ \"acc_stderr\": 0.03210859946330444,\n \"acc_norm\": 0.6516990000626086,\n\ \ \"acc_norm_stderr\": 0.03276278450908129,\n \"mc1\": 0.5030599755201959,\n\ \ \"mc1_stderr\": 0.017503173260960625,\n \"mc2\": 0.6636783326582055,\n\ \ \"mc2_stderr\": 0.015118018923624023\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6663822525597269,\n \"acc_stderr\": 0.013778687054176538,\n\ \ \"acc_norm\": 0.7013651877133106,\n \"acc_norm_stderr\": 0.01337407861506874\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6964748058155746,\n\ \ \"acc_stderr\": 0.004588403419449665,\n \"acc_norm\": 0.8737303326030671,\n\ \ \"acc_norm_stderr\": 0.003314742077083305\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \ \ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\ \ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\ \ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\ \ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\ \ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \ \ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569526,\n\ \ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569526\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\ \ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\ \ \"acc_norm_stderr\": 0.03514697467862388\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \ \ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\ : 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\ \ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\ \ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\ \ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\ \ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\ \ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\ \ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\ \ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\ \ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\ acc_norm\": 0.40476190476190477,\n 
\"acc_norm_stderr\": 0.025279850397404904\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\ \ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\ \ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\ \ \"acc_stderr\": 0.02328766512726854,\n \"acc_norm\": 0.7870967741935484,\n\ \ \"acc_norm_stderr\": 0.02328766512726854\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\ \ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\ : 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\ \ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\ acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n\ \ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \ \ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948475,\n \ \ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948475\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n\ \ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\ acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507337,\n \"\ acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507337\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\ acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\ acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8227848101265823,\n \"acc_stderr\": 0.02485636418450322,\n \ \ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.02485636418450322\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\ \ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\ \ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n\ \ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\ : 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\ \ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\ \ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\ \ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\ \ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \ \ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\ \ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\ \ \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n\ \ \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \ \ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n\ \ \"acc_stderr\": 0.013468201614066307,\n \"acc_norm\": 0.8288633461047255,\n\ \ \"acc_norm_stderr\": 0.013468201614066307\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265026,\n\ \ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265026\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n\ \ \"acc_stderr\": 0.01659802212058043,\n \"acc_norm\": 0.43910614525139663,\n\ \ 
\"acc_norm_stderr\": 0.01659802212058043\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.02573885479781873,\n\ \ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.02573885479781873\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\ \ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\ \ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \ \ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\ : 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"\ acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\ \ \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n\ \ \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.7132352941176471,\n \"acc_stderr\": 0.027472274473233815,\n\ \ \"acc_norm\": 0.7132352941176471,\n \"acc_norm_stderr\": 0.027472274473233815\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6617647058823529,\n \"acc_stderr\": 0.01913994374848704,\n \ \ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.01913994374848704\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\ \ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\ \ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\ \ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 
0.8258706467661692,\n\ \ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\ \ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \ \ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\ \ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\ \ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\ \ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\ \ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5030599755201959,\n\ \ \"mc1_stderr\": 0.017503173260960625,\n \"mc2\": 0.6636783326582055,\n\ \ \"mc2_stderr\": 0.015118018923624023\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8097868981846882,\n \"acc_stderr\": 0.01103033579861744\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6739954510993177,\n \ \ \"acc_stderr\": 0.012911675645682841\n }\n}\n```" repo_url: https://huggingface.co/nbeerbower/flammen3X leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|arc:challenge|25_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-09T18-21-14.905349.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|gsm8k|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_09T18_21_14.905349 path: - 
'**/details_harness|hellaswag|10_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T18-21-14.905349.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-09T18-21-14.905349.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T18-21-14.905349.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T18-21-14.905349.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T18-21-14.905349.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-09T18-21-14.905349.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T18-21-14.905349.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-management|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T18-21-14.905349.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|truthfulqa:mc|0_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-09T18-21-14.905349.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_09T18_21_14.905349 path: - '**/details_harness|winogrande|5_2024-03-09T18-21-14.905349.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-09T18-21-14.905349.parquet' - config_name: results data_files: - split: 
2024_03_09T18_21_14.905349
    path:
    - results_2024-03-09T18-21-14.905349.parquet
  - split: latest
    path:
    - results_2024-03-09T18-21-14.905349.parquet
---

# Dataset Card for Evaluation run of nbeerbower/flammen3X

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [nbeerbower/flammen3X](https://huggingface.co/nbeerbower/flammen3X) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nbeerbower__flammen3X",
    "harness_winogrande_5",
    split="train")
```

## Latest results

These are the [latest results from run 2024-03-09T18:21:14.905349](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__flammen3X/blob/main/results_2024-03-09T18-21-14.905349.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6511768014924072, "acc_stderr": 0.03210859946330444, "acc_norm": 0.6516990000626086, "acc_norm_stderr": 0.03276278450908129, "mc1": 0.5030599755201959, "mc1_stderr": 0.017503173260960625, "mc2": 0.6636783326582055, "mc2_stderr": 0.015118018923624023 }, "harness|arc:challenge|25": { "acc": 0.6663822525597269, "acc_stderr": 0.013778687054176538, "acc_norm": 0.7013651877133106, "acc_norm_stderr": 0.01337407861506874 }, "harness|hellaswag|10": { "acc": 0.6964748058155746, "acc_stderr": 0.004588403419449665, "acc_norm": 0.8737303326030671, "acc_norm_stderr": 0.003314742077083305 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.38, "acc_stderr": 0.048783173121456316, "acc_norm": 0.38, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6444444444444445, "acc_stderr": 0.04135176749720385, "acc_norm": 0.6444444444444445, "acc_norm_stderr": 0.04135176749720385 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6973684210526315, "acc_stderr": 0.03738520676119669, "acc_norm": 0.6973684210526315, "acc_norm_stderr": 0.03738520676119669 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569526, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569526 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.47, "acc_stderr": 0.050161355804659205, "acc_norm": 0.47, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.52, "acc_stderr": 0.050211673156867795, "acc_norm": 0.52, 
"acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6589595375722543, "acc_stderr": 0.03614665424180826, "acc_norm": 0.6589595375722543, "acc_norm_stderr": 0.03614665424180826 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4215686274509804, "acc_stderr": 0.04913595201274498, "acc_norm": 0.4215686274509804, "acc_norm_stderr": 0.04913595201274498 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.77, "acc_stderr": 0.04229525846816507, "acc_norm": 0.77, "acc_norm_stderr": 0.04229525846816507 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5659574468085107, "acc_stderr": 0.03240038086792747, "acc_norm": 0.5659574468085107, "acc_norm_stderr": 0.03240038086792747 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.49122807017543857, "acc_stderr": 0.04702880432049615, "acc_norm": 0.49122807017543857, "acc_norm_stderr": 0.04702880432049615 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.40476190476190477, "acc_stderr": 0.025279850397404904, "acc_norm": 0.40476190476190477, "acc_norm_stderr": 0.025279850397404904 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7870967741935484, "acc_stderr": 0.02328766512726854, "acc_norm": 0.7870967741935484, "acc_norm_stderr": 0.02328766512726854 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.4975369458128079, "acc_stderr": 0.03517945038691063, "acc_norm": 0.4975369458128079, "acc_norm_stderr": 0.03517945038691063 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7828282828282829, "acc_stderr": 0.02937661648494563, "acc_norm": 0.7828282828282829, "acc_norm_stderr": 0.02937661648494563 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8808290155440415, "acc_stderr": 0.023381935348121427, "acc_norm": 0.8808290155440415, "acc_norm_stderr": 0.023381935348121427 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.658974358974359, "acc_stderr": 0.02403548967633508, "acc_norm": 0.658974358974359, "acc_norm_stderr": 0.02403548967633508 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3333333333333333, "acc_stderr": 0.028742040903948475, "acc_norm": 0.3333333333333333, "acc_norm_stderr": 0.028742040903948475 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7142857142857143, "acc_stderr": 0.029344572500634332, "acc_norm": 0.7142857142857143, "acc_norm_stderr": 0.029344572500634332 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3576158940397351, "acc_stderr": 0.03913453431177258, "acc_norm": 0.3576158940397351, "acc_norm_stderr": 0.03913453431177258 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8330275229357799, "acc_stderr": 0.01599015488507337, "acc_norm": 0.8330275229357799, "acc_norm_stderr": 0.01599015488507337 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5092592592592593, "acc_stderr": 
0.034093869469927006, "acc_norm": 0.5092592592592593, "acc_norm_stderr": 0.034093869469927006 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8333333333333334, "acc_stderr": 0.026156867523931045, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.026156867523931045 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8227848101265823, "acc_stderr": 0.02485636418450322, "acc_norm": 0.8227848101265823, "acc_norm_stderr": 0.02485636418450322 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6816143497757847, "acc_stderr": 0.03126580522513713, "acc_norm": 0.6816143497757847, "acc_norm_stderr": 0.03126580522513713 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7557251908396947, "acc_stderr": 0.03768335959728744, "acc_norm": 0.7557251908396947, "acc_norm_stderr": 0.03768335959728744 }, "harness|hendrycksTest-international_law|5": { "acc": 0.768595041322314, "acc_stderr": 0.03849856098794088, "acc_norm": 0.768595041322314, "acc_norm_stderr": 0.03849856098794088 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7962962962962963, "acc_stderr": 0.03893542518824847, "acc_norm": 0.7962962962962963, "acc_norm_stderr": 0.03893542518824847 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.754601226993865, "acc_stderr": 0.03380939813943354, "acc_norm": 0.754601226993865, "acc_norm_stderr": 0.03380939813943354 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4375, "acc_stderr": 0.04708567521880525, "acc_norm": 0.4375, "acc_norm_stderr": 0.04708567521880525 }, "harness|hendrycksTest-management|5": { "acc": 0.7961165048543689, "acc_stderr": 0.03989139859531771, "acc_norm": 0.7961165048543689, "acc_norm_stderr": 0.03989139859531771 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.021586494001281376, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.021586494001281376 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 
0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8288633461047255, "acc_stderr": 0.013468201614066307, "acc_norm": 0.8288633461047255, "acc_norm_stderr": 0.013468201614066307 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7254335260115607, "acc_stderr": 0.024027745155265026, "acc_norm": 0.7254335260115607, "acc_norm_stderr": 0.024027745155265026 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.43910614525139663, "acc_stderr": 0.01659802212058043, "acc_norm": 0.43910614525139663, "acc_norm_stderr": 0.01659802212058043 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7189542483660131, "acc_stderr": 0.02573885479781873, "acc_norm": 0.7189542483660131, "acc_norm_stderr": 0.02573885479781873 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7106109324758842, "acc_stderr": 0.025755865922632945, "acc_norm": 0.7106109324758842, "acc_norm_stderr": 0.025755865922632945 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.4858156028368794, "acc_stderr": 0.02981549448368206, "acc_norm": 0.4858156028368794, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.4680573663624511, "acc_stderr": 0.012744149704869647, "acc_norm": 0.4680573663624511, "acc_norm_stderr": 0.012744149704869647 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7132352941176471, "acc_stderr": 0.027472274473233815, "acc_norm": 0.7132352941176471, "acc_norm_stderr": 0.027472274473233815 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6617647058823529, "acc_stderr": 0.01913994374848704, "acc_norm": 0.6617647058823529, "acc_norm_stderr": 0.01913994374848704 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, "acc_norm": 0.6818181818181818, 
"acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7428571428571429, "acc_stderr": 0.02797982353874455, "acc_norm": 0.7428571428571429, "acc_norm_stderr": 0.02797982353874455 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8258706467661692, "acc_stderr": 0.026814951200421603, "acc_norm": 0.8258706467661692, "acc_norm_stderr": 0.026814951200421603 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.0348735088019777, "acc_norm": 0.86, "acc_norm_stderr": 0.0348735088019777 }, "harness|hendrycksTest-virology|5": { "acc": 0.536144578313253, "acc_stderr": 0.038823108508905954, "acc_norm": 0.536144578313253, "acc_norm_stderr": 0.038823108508905954 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8362573099415205, "acc_stderr": 0.028380919596145866, "acc_norm": 0.8362573099415205, "acc_norm_stderr": 0.028380919596145866 }, "harness|truthfulqa:mc|0": { "mc1": 0.5030599755201959, "mc1_stderr": 0.017503173260960625, "mc2": 0.6636783326582055, "mc2_stderr": 0.015118018923624023 }, "harness|winogrande|5": { "acc": 0.8097868981846882, "acc_stderr": 0.01103033579861744 }, "harness|gsm8k|5": { "acc": 0.6739954510993177, "acc_stderr": 0.012911675645682841 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
joey234/mmlu-high_school_biology-neg-prepend-verbal
--- configs: - config_name: default data_files: - split: dev path: data/dev-* - split: test path: data/test-* dataset_info: features: - name: question dtype: string - name: choices sequence: string - name: answer dtype: class_label: names: '0': A '1': B '2': C '3': D - name: negate_openai_prompt struct: - name: content dtype: string - name: role dtype: string - name: neg_question dtype: string - name: fewshot_context dtype: string - name: ori_prompt dtype: string - name: neg_prompt dtype: string - name: fewshot_context_neg dtype: string - name: fewshot_context_ori dtype: string splits: - name: dev num_bytes: 8658 num_examples: 5 - name: test num_bytes: 3196056 num_examples: 310 download_size: 327280 dataset_size: 3204714 --- # Dataset Card for "mmlu-high_school_biology-neg-prepend-verbal" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Acumen/Test1
--- license: unknown ---
DFKI-SLT/science_ie
--- annotations_creators: - expert-generated language: - en language_creators: - found license: - other multilinguality: - monolingual pretty_name: ScienceIE is a dataset for the SemEval task of extracting key phrases and relations between them from scientific documents size_categories: - 1K<n<10K source_datasets: [] tags: - research papers - scientific papers task_categories: - token-classification - text-classification task_ids: - named-entity-recognition - multi-class-classification dataset_info: - config_name: ner features: - name: id dtype: string - name: tokens sequence: string - name: tags sequence: class_label: names: '0': O '1': B-Material '2': I-Material '3': B-Process '4': I-Process '5': B-Task '6': I-Task splits: - name: train num_bytes: 1185670 num_examples: 2388 - name: validation num_bytes: 204095 num_examples: 400 - name: test num_bytes: 399069 num_examples: 838 download_size: 13704567 dataset_size: 1788834 - config_name: re features: - name: id dtype: string - name: tokens dtype: string - name: arg1_start dtype: int32 - name: arg1_end dtype: int32 - name: arg1_type dtype: string - name: arg2_start dtype: int32 - name: arg2_end dtype: int32 - name: arg2_type dtype: string - name: relation dtype: class_label: names: '0': O '1': Synonym-of '2': Hyponym-of splits: - name: train num_bytes: 11738520 num_examples: 24558 - name: validation num_bytes: 2347796 num_examples: 4838 - name: test num_bytes: 2835275 num_examples: 6618 download_size: 13704567 dataset_size: 16921591 - config_name: subtask_a features: - name: id dtype: string - name: tokens sequence: string - name: tags sequence: class_label: names: '0': O '1': B '2': I splits: - name: train num_bytes: 1185670 num_examples: 2388 - name: validation num_bytes: 204095 num_examples: 400 - name: test num_bytes: 399069 num_examples: 838 download_size: 13704567 dataset_size: 1788834 - config_name: subtask_b features: - name: id dtype: string - name: tokens sequence: string - name: tags sequence: 
class_label: names: '0': O '1': M '2': P '3': T splits: - name: train num_bytes: 1185670 num_examples: 2388 - name: validation num_bytes: 204095 num_examples: 400 - name: test num_bytes: 399069 num_examples: 838 download_size: 13704567 dataset_size: 1788834 - config_name: subtask_c features: - name: id dtype: string - name: tokens sequence: string - name: tags sequence: sequence: class_label: names: '0': O '1': S '2': H splits: - name: train num_bytes: 20103682 num_examples: 2388 - name: validation num_bytes: 3575511 num_examples: 400 - name: test num_bytes: 6431513 num_examples: 838 download_size: 13704567 dataset_size: 30110706 --- # Dataset Card for ScienceIE ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** [https://scienceie.github.io/index.html](https://scienceie.github.io/index.html) - **Repository:** [https://github.com/ScienceIE/scienceie.github.io](https://github.com/ScienceIE/scienceie.github.io) - **Paper:** [SemEval 2017 
Task 10: ScienceIE - Extracting Keyphrases and Relations from Scientific Publications](https://arxiv.org/abs/1704.02853) - **Leaderboard:** [https://competitions.codalab.org/competitions/15898](https://competitions.codalab.org/competitions/15898) - **Size of downloaded dataset files:** 13.7 MB - **Size of generated dataset files:** 17.4 MB ### Dataset Summary ScienceIE is a dataset for the SemEval task of extracting key phrases and relations between them from scientific documents. A corpus for the task was built from ScienceDirect open access publications and was available freely for participants, without the need to sign a copyright agreement. Each data instance consists of one paragraph of text, drawn from a scientific paper. Publications were provided in plain text, in addition to XML format, which included the full text of the publication as well as additional metadata. 500 paragraphs from journal articles evenly distributed among the domains Computer Science, Material Sciences and Physics were selected. The corpus is split into 350 documents for training, 50 for development and 100 for testing. This is similar to the pilot task described in Section 5, for which 144 articles were used for training, 40 for development and 100 for testing. There are three subtasks: - Subtask (A): Identification of keyphrases - Given a scientific publication, the goal of this task is to identify all the keyphrases in the document. - Subtask (B): Classification of identified keyphrases - In this task, each keyphrase needs to be labelled by one of three types: (i) PROCESS, (ii) TASK, and (iii) MATERIAL. - PROCESS: Keyphrases relating to some scientific model, algorithm or process should be labelled by PROCESS. - TASK: Keyphrases that denote the application, end goal, problem or task should be labelled by TASK. - MATERIAL: MATERIAL keyphrases identify the resources used in the paper.
- Subtask (C): Extraction of relationships between two identified keyphrases - Every pair of keyphrases needs to be labelled by one of three types: (i) HYPONYM-OF, (ii) SYNONYM-OF, and (iii) NONE. - HYPONYM-OF: The relationship between two keyphrases A and B is HYPONYM-OF if the semantic field of A is included within that of B. One example is Red HYPONYM-OF Color. - SYNONYM-OF: The relationship between two keyphrases A and B is SYNONYM-OF if they both denote the same semantic field, for example Machine Learning SYNONYM-OF ML. Note: In this repository the documents were split into sentences using spaCy, resulting in a 2388/400/838 train/dev/test split. The `id` consists of the document id and the example index within the document separated by an underscore, e.g. `S0375960115004120_1`. This should enable you to reconstruct the documents from the sentences. ### Supported Tasks and Leaderboards - **Tasks:** Key phrase extraction and relation extraction in scientific documents - **Leaderboards:** [https://competitions.codalab.org/competitions/15898](https://competitions.codalab.org/competitions/15898) ### Languages The language in the dataset is English.
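Since each `id` encodes the document id and the sentence index separated by an underscore, reconstructing documents is a matter of grouping sentences by the id prefix. A minimal sketch (the helper name and the sample rows are illustrative, not part of the dataset):

```python
from collections import defaultdict

def group_sentences_by_document(examples):
    """Group sentence-level examples by document id, ordered by sentence index."""
    docs = defaultdict(list)
    for ex in examples:
        # Split "S0375960115004120_1" into document id and sentence index.
        doc_id, idx = ex["id"].rsplit("_", 1)
        docs[doc_id].append((int(idx), ex["tokens"]))
    # Sort each document's sentences by index and drop the index.
    return {
        doc_id: [tokens for _, tokens in sorted(sents)]
        for doc_id, sents in docs.items()
    }

# Illustrative rows mimicking the dataset's id scheme.
rows = [
    {"id": "S0375960115004120_2", "tokens": ["Second", "sentence", "."]},
    {"id": "S0375960115004120_1", "tokens": ["First", "sentence", "."]},
    {"id": "S0010938X15301202_1", "tokens": ["Another", "document", "."]},
]
docs = group_sentences_by_document(rows)
# docs["S0375960115004120"] holds that document's sentences in order.
```

The same grouping works for any of the configurations below, since they all share the `id` scheme.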
## Dataset Structure ### Data Instances #### subtask_a - **Size of downloaded dataset files:** 13.7 MB - **Size of the generated dataset:** 17.4 MB An example of 'train' looks as follows: ```json { "id": "S0375960115004120_1", "tokens": ["Another", "remarkable", "feature", "of", "the", "quantum", "field", "treatment", "can", "be", "revealed", "from", "the", "investigation", "of", "the", "vacuum", "state", "."], "tags": [0, 0, 0, 0, 0, 1, 2, 2, 0, 0, 0, 0, 0, 0, 0, 0, 1, 2, 0] } ``` #### subtask_b - **Size of downloaded dataset files:** 13.7 MB - **Size of the generated dataset:** 17.4 MB An example of 'train' looks as follows: ```json { "id": "S0375960115004120_2", "tokens": ["For", "a", "classical", "field", ",", "vacuum", "is", "realized", "by", "simply", "setting", "the", "potential", "to", "zero", "resulting", "in", "an", "unaltered", ",", "free", "evolution", "of", "the", "particle", "'s", "plane", "wave", "(", "|ψI〉=|ψIII〉=|k0", "〉", ")", "."], "tags": [0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 0, 0, 0, 0, 0] } ``` #### subtask_c - **Size of downloaded dataset files:** 13.7 MB - **Size of the generated dataset:** 30.1 MB An example of 'train' looks as follows: ```json { "id": "S0375960115004120_3", "tokens": ["In", "the", "quantized", "treatment", ",", "vacuum", "is", "represented", "by", "an", "initial", "Fock", "state", "|n0=0", "〉", "which", "still", "interacts", "with", "the", "particle", "and", "yields", "as", "final", "state", "|ΨIII", "〉", "behind", "the", "field", "region(19)|ΨI〉=|k0〉⊗|0〉⇒|ΨIII〉=∑n=0∞t0n|k−n〉⊗|n", "〉", "with", "a", "photon", "exchange", "probability(20)P0,n=|t0n|2=1n!e−Λ2Λ2n", "The", "particle", "thus", "transfers", "energy", "to", "the", "vacuum", "field", "leading", "to", "a", "Poissonian", "distributed", "final", "photon", "number", "."], "tags": [[0, 0, ...], [0, 0, ...], ...] 
} ``` Note: The tag sequence consists of vectors for each token, that encode what the relationship between that token and every other token in the sequence is for the first token in each key phrase. #### ner - **Size of downloaded dataset files:** 13.7 MB - **Size of the generated dataset:** 17.4 MB An example of 'train' looks as follows: ```json { "id": "S0375960115004120_4", "tokens": ["Let", "'s", "consider", ",", "for", "example", ",", "a", "superconducting", "resonant", "circuit", "as", "source", "of", "the", "field", "."], "tags": [0, 0, 0, 0, 0, 0, 0, 0, 1, 2, 2, 0, 0, 0, 0, 0, 0] } ``` #### re - **Size of downloaded dataset files:** 13.7 MB - **Size of the generated dataset:** 16.4 MB An example of 'train' looks as follows: ```json { "id": "S0375960115004120_5", "tokens": ["In", "the", "quantized", "treatment", ",", "vacuum", "is", "represented", "by", "an", "initial", "Fock", "state", "|n0=0", "〉", "which", "still", "interacts", "with", "the", "particle", "and", "yields", "as", "final", "state", "|ΨIII", "〉", "behind", "the", "field", "region(19)|ΨI〉=|k0〉⊗|0〉⇒|ΨIII〉=∑n=0∞t0n|k−n〉⊗|n", "〉", "with", "a", "photon", "exchange", "probability(20)P0,n=|t0n|2=1n!e−Λ2Λ2n", "The", "particle", "thus", "transfers", "energy", "to", "the", "vacuum", "field", "leading", "to", "a", "Poissonian", "distributed", "final", "photon", "number", "."], "arg1_start": 2, "arg1_end": 4, "arg1_type": "Task", "arg2_start": 5, "arg2_end": 6, "arg2_type": "Material", "relation": 0 } ``` ### Data Fields #### subtask_a - `id`: the instance id of this sentence, a `string` feature. - `tokens`: the list of tokens of this sentence, obtained with spaCy, a `list` of `string` features. - `tags`: the list of tags of this sentence marking a token as being outside, at the beginning, or inside a key phrase, a `list` of classification labels. ```python {"O": 0, "B": 1, "I": 2} ``` #### subtask_b - `id`: the instance id of this sentence, a `string` feature. 
- `tokens`: the list of tokens of this sentence, obtained with spaCy, a `list` of `string` features. - `tags`: the list of tags of this sentence marking a token as being outside a key phrase, or being part of a material, process or task, a `list` of classification labels. ```python {"O": 0, "M": 1, "P": 2, "T": 3} ``` #### subtask_c - `id`: the instance id of this sentence, a `string` feature. - `tokens`: the list of tokens of this sentence, obtained with spaCy, a `list` of `string` features. - `tags`: a vector for each token that encodes what the relationship between that token and every other token in the sequence is for the first token in each key phrase, a `list` of `list`s of classification labels. ```python {"O": 0, "S": 1, "H": 2} ``` #### ner - `id`: the instance id of this sentence, a `string` feature. - `tokens`: the list of tokens of this sentence, obtained with spaCy, a `list` of `string` features. - `tags`: the list of ner tags of this sentence, a `list` of classification labels. ```python {"O": 0, "B-Material": 1, "I-Material": 2, "B-Process": 3, "I-Process": 4, "B-Task": 5, "I-Task": 6} ``` #### re - `id`: the instance id of this sentence, a `string` feature. - `tokens`: the list of tokens of this sentence, obtained with spaCy, a `list` of `string` features. - `arg1_start`: the 0-based index of the start token of the relation arg1 mention, an `int` feature. - `arg1_end`: the 0-based index of the end token of the relation arg1 mention, exclusive, an `int` feature. - `arg1_type`: the key phrase type of the relation arg1 mention, a `string` feature. - `arg2_start`: the 0-based index of the start token of the relation arg2 mention, an `int` feature. - `arg2_end`: the 0-based index of the end token of the relation arg2 mention, exclusive, an `int` feature. - `arg2_type`: the key phrase type of the relation arg2 mention, a `string` feature. - `relation`: the relation label of this instance, a classification label.
```python {"O": 0, "Synonym-of": 1, "Hyponym-of": 2} ``` ### Data Splits | | Train | Dev | Test | |-----------|-------|------|------| | subtask_a | 2388 | 400 | 838 | | subtask_b | 2388 | 400 | 838 | | subtask_c | 2388 | 400 | 838 | | ner | 2388 | 400 | 838 | | re | 24558 | 4838 | 6618 | ## Dataset Creation ### Curation Rationale [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Source Data #### Initial Data Collection and Normalization [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) #### Who are the source language producers? [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Annotations #### Annotation process [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) #### Who are the annotators? 
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Personal and Sensitive Information [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Discussion of Biases [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Other Known Limitations [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ## Additional Information ### Dataset Curators [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Licensing Information [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) ### Citation Information ``` @article{DBLP:journals/corr/AugensteinDRVM17, author = {Isabelle Augenstein and Mrinal Das and Sebastian Riedel and Lakshmi Vikraman and Andrew McCallum}, title = {SemEval 2017 Task 10: ScienceIE - Extracting Keyphrases and Relations from Scientific Publications}, journal = {CoRR}, volume = {abs/1704.02853}, year = {2017}, url = {http://arxiv.org/abs/1704.02853}, eprinttype = {arXiv}, eprint = {1704.02853}, timestamp = {Mon, 13 Aug 2018 16:46:36 +0200}, biburl = {https://dblp.org/rec/journals/corr/AugensteinDRVM17.bib}, bibsource = {dblp computer science bibliography, https://dblp.org} } ``` ### Contributions Thanks to [@phucdev](https://github.com/phucdev) for adding this dataset.
usvsnsp/temp-perplexities
---
dataset_info:
  features:
  - name: index
    dtype: int32
  - name: loss
    dtype: float32
  - name: prompt_perplexity
    dtype: float32
  - name: generation_perplexity
    dtype: float32
  - name: sequence_perplexity
    dtype: float32
  splits:
  - name: pile.duped.6.9b
    num_bytes: 100000000
    num_examples: 5000000
  - name: memories.duped.6.9b
    num_bytes: 42419520
    num_examples: 2120976
  - name: memories.duped.12b
    num_bytes: 47646560
    num_examples: 2382328
  - name: memories.deduped.12b
    num_bytes: 37424320
    num_examples: 1871216
  download_size: 255603395
  dataset_size: 227490400
---

# Dataset Card for "temp-perplexities"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ibranze/araproje_mmlu_tr_conf2
---
dataset_info:
  features:
  - name: question
    dtype: string
  - name: subject
    dtype: string
  - name: choices
    sequence: string
  - name: answer
    dtype:
      class_label:
        names:
          '0': A
          '1': B
          '2': C
          '3': D
  splits:
  - name: validation
    num_bytes: 137404.0
    num_examples: 250
  download_size: 82743
  dataset_size: 137404.0
configs:
- config_name: default
  data_files:
  - split: validation
    path: data/validation-*
---

# Dataset Card for "araproje_mmlu_tr_conf2"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
enoahjr/twitter_dataset_1713193674
---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: tweet_content
    dtype: string
  - name: user_name
    dtype: string
  - name: user_id
    dtype: string
  - name: created_at
    dtype: string
  - name: url
    dtype: string
  - name: favourite_count
    dtype: int64
  - name: scraped_at
    dtype: string
  - name: image_urls
    dtype: string
  splits:
  - name: train
    num_bytes: 750063
    num_examples: 2261
  download_size: 402675
  dataset_size: 750063
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
CyberHarem/mitsuki_felicia_puellamagimadokamagicasidestorymagiarecord
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of Mitsuki Felicia

This is the dataset of Mitsuki Felicia, containing 151 images and their tags.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

| Name        | Images | Download                            | Description                                                             |
|:------------|-------:|:------------------------------------|:------------------------------------------------------------------------|
| raw         | 151    | [Download](dataset-raw.zip)         | Raw data with meta information.                                          |
| raw-stage3  | 364    | [Download](dataset-raw-stage3.zip)  | 3-stage cropped raw data with meta information.                          |
| 384x512     | 151    | [Download](dataset-384x512.zip)     | 384x512 aligned dataset.                                                 |
| 512x512     | 151    | [Download](dataset-512x512.zip)     | 512x512 aligned dataset.                                                 |
| 512x704     | 151    | [Download](dataset-512x704.zip)     | 512x704 aligned dataset.                                                 |
| 640x640     | 151    | [Download](dataset-640x640.zip)     | 640x640 aligned dataset.                                                 |
| 640x880     | 151    | [Download](dataset-640x880.zip)     | 640x880 aligned dataset.                                                 |
| stage3-640  | 364    | [Download](dataset-stage3-640.zip)  | 3-stage cropped dataset with the shorter side not exceeding 640 pixels.  |
| stage3-800  | 364    | [Download](dataset-stage3-800.zip)  | 3-stage cropped dataset with the shorter side not exceeding 800 pixels.  |
| stage3-1200 | 364    | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
yangyz1230/H4_not_filtered
---
dataset_info:
  features:
  - name: name
    dtype: string
  - name: sequence
    dtype: string
  - name: chrom
    dtype: string
  - name: start
    dtype: int64
  - name: end
    dtype: int64
  - name: strand
    dtype: string
  - name: label
    dtype: int64
  splits:
  - name: train
    num_bytes: 311017
    num_examples: 566
  - name: test
    num_bytes: 38234
    num_examples: 70
  download_size: 171903
  dataset_size: 349251
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
---
unicamp-dl/quati
---
license: cc-by-4.0
task_categories:
- question-answering
language:
- pt
size_categories:
- 1M<n<10M
---

# Quati Information Retrieval Dataset

The Quati dataset aims to support the development of Brazilian Portuguese (pt-br) Information Retrieval (IR) systems, providing document passages originally created in pt-br, as well as queries (topics) created by native speakers.

## License

Quati is licensed under [Creative Commons Attribution 4.0 International (CC BY 4.0)](https://creativecommons.org/licenses/by/4.0/).

## Citation Information

```
@misc{bueno2024quati,
      title={Quati: A Brazilian Portuguese Information Retrieval Dataset from Native Speakers},
      author={Mirelle Bueno and Eduardo Seiti de Oliveira and Rodrigo Nogueira and Roberto A. Lotufo and Jayr Alencar Pereira},
      year={2024},
      eprint={2404.06976},
      archivePrefix={arXiv},
      primaryClass={cs.IR}
}
```

## About Quati and how to use

The Quati dataset is currently released in two versions: one with 1 million passages, and a larger one with 10 million passages. So far we have prepared only validation *qrels* for both versions, annotating 50 topics with an average of 97.78 passages per query on the 10M version, and 38.66 passages per query on the 1M version.

Quati can be used to evaluate any Information Retrieval system targeting Brazilian Portuguese documents. The dataset creation and annotation pipeline can also be used to further expand the passage annotations, or to create other IR datasets targeting specific languages.

Please refer to [our publication](https://arxiv.org/abs/2404.06976) for further details about the dataset and its creation process.
### Obtaining the 1M dataset version

```python
>>> from datasets import load_dataset
>>> quati_1M_passages = load_dataset("unicamp-dl/quati", "quati_1M_passages")
>>> quati_1M_passages['quati_1M_passages'][0]
{'passage_id': 'clueweb22-pt0000-00-00003_1',
 'passage': 'Se você precisar de ajuda, visite o website nacional sobre a COVID-19 ou ligue para a linha de apoio à COVID-19 808 24 24 24 Perguntas mais frequentes Posso viajar entre Sintra e Cascais? Quais são as restrições de viagem em Cascais? Qual o número de telefone de apoio para a COVID 19 em Cascais? Preciso utilizar máscara facial no transporte público em Cascais? A prática do distanciamento social é compulsória em Cascais? O que eu devo fazer caso apresente sintomas da COVID-19 quando chegar em Cascais? Última atualização: 25 Abr 2022 Aplicam-se exceções, para detalhes completos: European Union. Estamos trabalhando ininterruptamente para lhe trazer as últimas informações de viagem relacionadas à COVID-19. Esta informação é compilada a partir de fontes oficiais. Ao melhor de nosso conhecimento, está correta de acordo com a última atualização. Visite Avisos de Viagem Rome2rio para ajuda geral. Perguntas & Respostas Qual a maneira mais econômica de ir de Sintra para Cascais? Qual a maneira mais rápida de ir de Sintra para Cascais? Existe um ônibus direto entre Sintra e Cascais? Qual a distância entre Sintra e Cascais? Como viajar de Sintra para Cascais sem carro? Qual a distância entre Sintra e Cascais? Aonde pegar ônibus de Sintra para Cascais?'}
```

### Obtaining the complete 10M dataset version

```python
>>> quati_10M_all = load_dataset("unicamp-dl/quati", "quati_10M_passages")
>>> quati_10M_all
DatasetDict({
    quati_10M_passages_part_00: Dataset({
        features: ['passage_id', 'passage'],
        num_rows: 2000000
    })
    quati_10M_passages_part_01: Dataset({
        features: ['passage_id', 'passage'],
        num_rows: 2000000
    })
    quati_10M_passages_part_02: Dataset({
        features: ['passage_id', 'passage'],
        num_rows: 2000000
    })
    quati_10M_passages_part_03: Dataset({
        features: ['passage_id', 'passage'],
        num_rows: 2000000
    })
    quati_10M_passages_part_04: Dataset({
        features: ['passage_id', 'passage'],
        num_rows: 2000000
    })
})
```

It is possible to download only single parts of the 10M dataset by directly passing the corresponding configuration name ― e.g. `quati_10M_passages_part_02`.

### Obtaining the qrels

```python
>>> quati_1M_qrels = load_dataset("unicamp-dl/quati", "quati_1M_qrels")
>>> quati_1M_qrels['quati_1M_qrels'][0]
{'query_id': 1, 'passage_id': 'clueweb22-pt0000-78-09747_0', 'score': 1}
```

```python
>>> quati_10M_qrels = load_dataset("unicamp-dl/quati", "quati_10M_qrels")
>>> quati_10M_qrels['quati_10M_qrels'][2560]
{'query_id': 105, 'passage_id': 'clueweb22-pt0001-93-08394_4', 'score': 0}
```

### Obtaining the topics

```python
>>> quati_all_topics = load_dataset("unicamp-dl/quati", "quati_all_topics")
>>> quati_all_topics['quati_all_topics'][-1]
{'query_id': 199, 'query': 'Quais foram os principais artistas do Impressionismo?'}
```

```python
>>> quati_test_topics = load_dataset("unicamp-dl/quati", "quati_test_topics")
>>> quati_test_topics['quati_test_topics'][0]
{'query_id': 1, 'query': 'Qual a maior característica da fauna brasileira?'}
```

### Cloning the dataset repository from Hugging Face

It is possible to obtain the entire dataset by cloning its git repository from the Hugging Face Hub:

```
git clone git@hf.co:datasets/unicamp-dl/quati quati_dataset
```
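The qrels records shown above map `query_id`/`passage_id` pairs to a relevance `score`. For evaluation it is usually convenient to first group the relevant passage ids per query. A minimal stdlib-only sketch (the two rows are the sample records from the examples above; the grouping helper itself is illustrative, not part of the dataset tooling):

```python
from collections import defaultdict

# Two illustrative qrels rows in the record format shown above.
# A score > 0 marks the passage as relevant for the query.
qrels_rows = [
    {"query_id": 1, "passage_id": "clueweb22-pt0000-78-09747_0", "score": 1},
    {"query_id": 105, "passage_id": "clueweb22-pt0001-93-08394_4", "score": 0},
]

# Group relevant passage ids by query for fast lookup during evaluation.
relevant = defaultdict(set)
for row in qrels_rows:
    if row["score"] > 0:
        relevant[row["query_id"]].add(row["passage_id"])

print(dict(relevant))  # {1: {'clueweb22-pt0000-78-09747_0'}}
```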
open-llm-leaderboard/details_msy127__mnsim-dpo-peftmerged-2-eos
--- pretty_name: Evaluation run of msy127/mnsim-dpo-peftmerged-2-eos dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [msy127/mnsim-dpo-peftmerged-2-eos](https://huggingface.co/msy127/mnsim-dpo-peftmerged-2-eos)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_msy127__mnsim-dpo-peftmerged-2-eos\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-02-01T22:52:39.126509](https://huggingface.co/datasets/open-llm-leaderboard/details_msy127__mnsim-dpo-peftmerged-2-eos/blob/main/results_2024-02-01T22-52-39.126509.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5126345853384686,\n\ \ \"acc_stderr\": 0.034203989562530804,\n \"acc_norm\": 0.5177735746052927,\n\ \ \"acc_norm_stderr\": 0.03497190870754416,\n \"mc1\": 0.31701346389228885,\n\ \ \"mc1_stderr\": 0.016289203374403392,\n \"mc2\": 0.4637339879535235,\n\ \ \"mc2_stderr\": 0.014647929379084504\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.5426621160409556,\n \"acc_stderr\": 0.014558106543924065,\n\ \ \"acc_norm\": 0.5563139931740614,\n \"acc_norm_stderr\": 0.014518421825670445\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5824536944831706,\n\ \ \"acc_stderr\": 0.004921466591335048,\n \"acc_norm\": 0.77823142800239,\n\ \ \"acc_norm_stderr\": 0.0041458720916152155\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\ \ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\ \ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\ \ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04068942293855797,\n \ \ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04068942293855797\n },\n\ \ \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n \ \ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \ \ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.03019761160019795,\n\ \ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.03019761160019795\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n\ \ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.5208333333333334,\n\ \ \"acc_norm_stderr\": 0.041775789507399935\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\ \ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5086705202312138,\n\ \ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.5086705202312138,\n\ \ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n\ \ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\ \ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\ \ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\ \ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\ \ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\ \ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523874,\n \"\ acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523874\n\ \ 
},\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\ \ \"acc_stderr\": 0.04240799327574925,\n \"acc_norm\": 0.3412698412698413,\n\ \ \"acc_norm_stderr\": 0.04240799327574925\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5838709677419355,\n\ \ \"acc_stderr\": 0.028040981380761533,\n \"acc_norm\": 0.5838709677419355,\n\ \ \"acc_norm_stderr\": 0.028040981380761533\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\ \ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\ : 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.03781887353205982,\n\ \ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.03781887353205982\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\ acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.03074890536390988,\n\ \ \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 0.03074890536390988\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.5230769230769231,\n \"acc_stderr\": 0.025323990861736236,\n\ \ \"acc_norm\": 0.5230769230769231,\n \"acc_norm_stderr\": 0.025323990861736236\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 
0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \ \ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.03243718055137411,\n \ \ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.03243718055137411\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\ acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.710091743119266,\n \"acc_stderr\": 0.019453066609201597,\n \"\ acc_norm\": 0.710091743119266,\n \"acc_norm_stderr\": 0.019453066609201597\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4027777777777778,\n \"acc_stderr\": 0.033448873829978666,\n \"\ acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.033448873829978666\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.6617647058823529,\n \"acc_stderr\": 0.03320574612945431,\n \"\ acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.03320574612945431\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.70042194092827,\n \"acc_stderr\": 0.02981802474975309,\n \ \ \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.02981802474975309\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\ \ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\ \ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\ \ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.6942148760330579,\n \"acc_stderr\": 
0.04205953933884122,\n \"\ acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884122\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\ \ \"acc_stderr\": 0.046166311118017125,\n \"acc_norm\": 0.6481481481481481,\n\ \ \"acc_norm_stderr\": 0.046166311118017125\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.5766871165644172,\n \"acc_stderr\": 0.03881891213334384,\n\ \ \"acc_norm\": 0.5766871165644172,\n \"acc_norm_stderr\": 0.03881891213334384\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\ \ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\ \ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.0482572933735639,\n\ \ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.0482572933735639\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7393162393162394,\n\ \ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.7393162393162394,\n\ \ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \ \ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6807151979565773,\n\ \ \"acc_stderr\": 0.01667126174953871,\n \"acc_norm\": 0.6807151979565773,\n\ \ \"acc_norm_stderr\": 0.01667126174953871\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.6098265895953757,\n \"acc_stderr\": 0.026261677607806642,\n\ \ \"acc_norm\": 0.6098265895953757,\n \"acc_norm_stderr\": 0.026261677607806642\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\ \ \"acc_stderr\": 0.014465893829859936,\n \"acc_norm\": 0.24916201117318434,\n\ \ \"acc_norm_stderr\": 0.014465893829859936\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n 
\"acc\": 0.545751633986928,\n \"acc_stderr\": 0.028509807802626595,\n\ \ \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.028509807802626595\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6077170418006431,\n\ \ \"acc_stderr\": 0.027731258647011998,\n \"acc_norm\": 0.6077170418006431,\n\ \ \"acc_norm_stderr\": 0.027731258647011998\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.5802469135802469,\n \"acc_stderr\": 0.027460099557005135,\n\ \ \"acc_norm\": 0.5802469135802469,\n \"acc_norm_stderr\": 0.027460099557005135\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \ \ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43089960886571055,\n\ \ \"acc_stderr\": 0.012647695889547228,\n \"acc_norm\": 0.43089960886571055,\n\ \ \"acc_norm_stderr\": 0.012647695889547228\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5367647058823529,\n \"acc_stderr\": 0.03029061918048569,\n\ \ \"acc_norm\": 0.5367647058823529,\n \"acc_norm_stderr\": 0.03029061918048569\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5196078431372549,\n \"acc_stderr\": 0.020212274976302957,\n \ \ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.020212274976302957\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\ \ \"acc_stderr\": 0.04631381319425464,\n \"acc_norm\": 0.6272727272727273,\n\ \ \"acc_norm_stderr\": 0.04631381319425464\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.5102040816326531,\n \"acc_stderr\": 0.03200255347893783,\n\ \ \"acc_norm\": 0.5102040816326531,\n \"acc_norm_stderr\": 0.03200255347893783\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n\ \ \"acc_stderr\": 0.032510068164586195,\n 
\"acc_norm\": 0.6965174129353234,\n\ \ \"acc_norm_stderr\": 0.032510068164586195\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421296,\n \ \ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421296\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\ \ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\ \ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\ \ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31701346389228885,\n\ \ \"mc1_stderr\": 0.016289203374403392,\n \"mc2\": 0.4637339879535235,\n\ \ \"mc2_stderr\": 0.014647929379084504\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803159\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16906747536012132,\n \ \ \"acc_stderr\": 0.010324171445497347\n }\n}\n```" repo_url: https://huggingface.co/msy127/mnsim-dpo-peftmerged-2-eos leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|arc:challenge|25_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-02-01T22-52-39.126509.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|gsm8k|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hellaswag|10_2024-02-01T22-52-39.126509.parquet' - split: latest path: - 
'**/details_harness|hellaswag|10_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-52-39.126509.parquet' - 
'**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-52-39.126509.parquet' - 
'**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-52-39.126509.parquet' - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-52-39.126509.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-52-39.126509.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-01T22-52-39.126509.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-52-39.126509.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-management|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-virology|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T22-52-39.126509.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|truthfulqa:mc|0_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-02-01T22-52-39.126509.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_02_01T22_52_39.126509 path: - '**/details_harness|winogrande|5_2024-02-01T22-52-39.126509.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-02-01T22-52-39.126509.parquet' - config_name: results data_files: - split: 
      2024_02_01T22_52_39.126509
    path:
    - results_2024-02-01T22-52-39.126509.parquet
  - split: latest
    path:
    - results_2024-02-01T22-52-39.126509.parquet
---

# Dataset Card for Evaluation run of msy127/mnsim-dpo-peftmerged-2-eos

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [msy127/mnsim-dpo-peftmerged-2-eos](https://huggingface.co/msy127/mnsim-dpo-peftmerged-2-eos) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_msy127__mnsim-dpo-peftmerged-2-eos",
    "harness_winogrande_5",
    split="latest")
```

## Latest results

These are the [latest results from run 2024-02-01T22:52:39.126509](https://huggingface.co/datasets/open-llm-leaderboard/details_msy127__mnsim-dpo-peftmerged-2-eos/blob/main/results_2024-02-01T22-52-39.126509.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
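As an aside on the naming scheme: the timestamped split names are simply the run timestamp with "-" replaced by "_", as can be seen by comparing the result file name above with the split names in the YAML header. A small sketch of the convention (the commented-out `load_dataset` call is illustrative usage, not executed here):

```python
# Split names in this dataset are run timestamps with "-" replaced by "_"
# (compare "results_2024-02-01T22-52-39.126509.parquet" with the split
# name "2024_02_01T22_52_39.126509" in the YAML header above).
run_timestamp = "2024-02-01T22-52-39.126509"
split_name = run_timestamp.replace("-", "_")
print(split_name)  # 2024_02_01T22_52_39.126509

# That split name (or simply "latest") can then be passed to load_dataset:
#   from datasets import load_dataset
#   data = load_dataset(
#       "open-llm-leaderboard/details_msy127__mnsim-dpo-peftmerged-2-eos",
#       "harness_winogrande_5",
#       split=split_name,
#   )
```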
You find each in the results and the "latest" split for each eval):

```python
{
    "all": {
        "acc": 0.5126345853384686,
        "acc_stderr": 0.034203989562530804,
        "acc_norm": 0.5177735746052927,
        "acc_norm_stderr": 0.03497190870754416,
        "mc1": 0.31701346389228885,
        "mc1_stderr": 0.016289203374403392,
        "mc2": 0.4637339879535235,
        "mc2_stderr": 0.014647929379084504
    },
    "harness|arc:challenge|25": {
        "acc": 0.5426621160409556,
        "acc_stderr": 0.014558106543924065,
        "acc_norm": 0.5563139931740614,
        "acc_norm_stderr": 0.014518421825670445
    },
    "harness|hellaswag|10": {
        "acc": 0.5824536944831706,
        "acc_stderr": 0.004921466591335048,
        "acc_norm": 0.77823142800239,
        "acc_norm_stderr": 0.0041458720916152155
    },
    "harness|hendrycksTest-abstract_algebra|5": {
        "acc": 0.3,
        "acc_stderr": 0.04605661864718381,
        "acc_norm": 0.3,
        "acc_norm_stderr": 0.04605661864718381
    },
    "harness|hendrycksTest-anatomy|5": {
        "acc": 0.4740740740740741,
        "acc_stderr": 0.04313531696750574,
        "acc_norm": 0.4740740740740741,
        "acc_norm_stderr": 0.04313531696750574
    },
    "harness|hendrycksTest-astronomy|5": {
        "acc": 0.5,
        "acc_stderr": 0.04068942293855797,
        "acc_norm": 0.5,
        "acc_norm_stderr": 0.04068942293855797
    },
    "harness|hendrycksTest-business_ethics|5": {
        "acc": 0.47,
        "acc_stderr": 0.050161355804659205,
        "acc_norm": 0.47,
        "acc_norm_stderr": 0.050161355804659205
    },
    "harness|hendrycksTest-clinical_knowledge|5": {
        "acc": 0.5962264150943396,
        "acc_stderr": 0.03019761160019795,
        "acc_norm": 0.5962264150943396,
        "acc_norm_stderr": 0.03019761160019795
    },
    "harness|hendrycksTest-college_biology|5": {
        "acc": 0.5208333333333334,
        "acc_stderr": 0.041775789507399935,
        "acc_norm": 0.5208333333333334,
        "acc_norm_stderr": 0.041775789507399935
    },
    "harness|hendrycksTest-college_chemistry|5": {
        "acc": 0.37,
        "acc_stderr": 0.04852365870939099,
        "acc_norm": 0.37,
        "acc_norm_stderr": 0.04852365870939099
    },
    "harness|hendrycksTest-college_computer_science|5": {
        "acc": 0.45,
        "acc_stderr": 0.049999999999999996,
        "acc_norm": 0.45,
        "acc_norm_stderr": 0.049999999999999996
    },
    "harness|hendrycksTest-college_mathematics|5": {
        "acc": 0.31,
        "acc_stderr": 0.04648231987117316,
        "acc_norm": 0.31,
        "acc_norm_stderr": 0.04648231987117316
    },
    "harness|hendrycksTest-college_medicine|5": {
        "acc": 0.5086705202312138,
        "acc_stderr": 0.03811890988940412,
        "acc_norm": 0.5086705202312138,
        "acc_norm_stderr": 0.03811890988940412
    },
    "harness|hendrycksTest-college_physics|5": {
        "acc": 0.20588235294117646,
        "acc_stderr": 0.04023382273617749,
        "acc_norm": 0.20588235294117646,
        "acc_norm_stderr": 0.04023382273617749
    },
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.61,
        "acc_stderr": 0.04902071300001975,
        "acc_norm": 0.61,
        "acc_norm_stderr": 0.04902071300001975
    },
    "harness|hendrycksTest-conceptual_physics|5": {
        "acc": 0.4,
        "acc_stderr": 0.03202563076101735,
        "acc_norm": 0.4,
        "acc_norm_stderr": 0.03202563076101735
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.2543859649122807,
        "acc_stderr": 0.040969851398436716,
        "acc_norm": 0.2543859649122807,
        "acc_norm_stderr": 0.040969851398436716
    },
    "harness|hendrycksTest-electrical_engineering|5": {
        "acc": 0.4413793103448276,
        "acc_stderr": 0.04137931034482758,
        "acc_norm": 0.4413793103448276,
        "acc_norm_stderr": 0.04137931034482758
    },
    "harness|hendrycksTest-elementary_mathematics|5": {
        "acc": 0.30952380952380953,
        "acc_stderr": 0.023809523809523874,
        "acc_norm": 0.30952380952380953,
        "acc_norm_stderr": 0.023809523809523874
    },
    "harness|hendrycksTest-formal_logic|5": {
        "acc": 0.3412698412698413,
        "acc_stderr": 0.04240799327574925,
        "acc_norm": 0.3412698412698413,
        "acc_norm_stderr": 0.04240799327574925
    },
    "harness|hendrycksTest-global_facts|5": {
        "acc": 0.34,
        "acc_stderr": 0.04760952285695235,
        "acc_norm": 0.34,
        "acc_norm_stderr": 0.04760952285695235
    },
    "harness|hendrycksTest-high_school_biology|5": {
        "acc": 0.5838709677419355,
        "acc_stderr": 0.028040981380761533,
        "acc_norm": 0.5838709677419355,
        "acc_norm_stderr": 0.028040981380761533
    },
    "harness|hendrycksTest-high_school_chemistry|5": {
        "acc": 0.458128078817734,
        "acc_stderr": 0.03505630140785741,
        "acc_norm": 0.458128078817734,
        "acc_norm_stderr": 0.03505630140785741
    },
    "harness|hendrycksTest-high_school_computer_science|5": {
        "acc": 0.44,
        "acc_stderr": 0.04988876515698589,
        "acc_norm": 0.44,
        "acc_norm_stderr": 0.04988876515698589
    },
    "harness|hendrycksTest-high_school_european_history|5": {
        "acc": 0.6242424242424243,
        "acc_stderr": 0.03781887353205982,
        "acc_norm": 0.6242424242424243,
        "acc_norm_stderr": 0.03781887353205982
    },
    "harness|hendrycksTest-high_school_geography|5": {
        "acc": 0.6868686868686869,
        "acc_stderr": 0.033042050878136525,
        "acc_norm": 0.6868686868686869,
        "acc_norm_stderr": 0.033042050878136525
    },
    "harness|hendrycksTest-high_school_government_and_politics|5": {
        "acc": 0.7616580310880829,
        "acc_stderr": 0.03074890536390988,
        "acc_norm": 0.7616580310880829,
        "acc_norm_stderr": 0.03074890536390988
    },
    "harness|hendrycksTest-high_school_macroeconomics|5": {
        "acc": 0.5230769230769231,
        "acc_stderr": 0.025323990861736236,
        "acc_norm": 0.5230769230769231,
        "acc_norm_stderr": 0.025323990861736236
    },
    "harness|hendrycksTest-high_school_mathematics|5": {
        "acc": 0.26296296296296295,
        "acc_stderr": 0.026842057873833706,
        "acc_norm": 0.26296296296296295,
        "acc_norm_stderr": 0.026842057873833706
    },
    "harness|hendrycksTest-high_school_microeconomics|5": {
        "acc": 0.5252100840336135,
        "acc_stderr": 0.03243718055137411,
        "acc_norm": 0.5252100840336135,
        "acc_norm_stderr": 0.03243718055137411
    },
    "harness|hendrycksTest-high_school_physics|5": {
        "acc": 0.3443708609271523,
        "acc_stderr": 0.03879687024073327,
        "acc_norm": 0.3443708609271523,
        "acc_norm_stderr": 0.03879687024073327
    },
    "harness|hendrycksTest-high_school_psychology|5": {
        "acc": 0.710091743119266,
        "acc_stderr": 0.019453066609201597,
        "acc_norm": 0.710091743119266,
        "acc_norm_stderr": 0.019453066609201597
    },
    "harness|hendrycksTest-high_school_statistics|5": {
        "acc": 0.4027777777777778,
        "acc_stderr": 0.033448873829978666,
        "acc_norm": 0.4027777777777778,
        "acc_norm_stderr": 0.033448873829978666
    },
    "harness|hendrycksTest-high_school_us_history|5": {
        "acc": 0.6617647058823529,
        "acc_stderr": 0.03320574612945431,
        "acc_norm": 0.6617647058823529,
        "acc_norm_stderr": 0.03320574612945431
    },
    "harness|hendrycksTest-high_school_world_history|5": {
        "acc": 0.70042194092827,
        "acc_stderr": 0.02981802474975309,
        "acc_norm": 0.70042194092827,
        "acc_norm_stderr": 0.02981802474975309
    },
    "harness|hendrycksTest-human_aging|5": {
        "acc": 0.6053811659192825,
        "acc_stderr": 0.03280400504755291,
        "acc_norm": 0.6053811659192825,
        "acc_norm_stderr": 0.03280400504755291
    },
    "harness|hendrycksTest-human_sexuality|5": {
        "acc": 0.5954198473282443,
        "acc_stderr": 0.043046937953806645,
        "acc_norm": 0.5954198473282443,
        "acc_norm_stderr": 0.043046937953806645
    },
    "harness|hendrycksTest-international_law|5": {
        "acc": 0.6942148760330579,
        "acc_stderr": 0.04205953933884122,
        "acc_norm": 0.6942148760330579,
        "acc_norm_stderr": 0.04205953933884122
    },
    "harness|hendrycksTest-jurisprudence|5": {
        "acc": 0.6481481481481481,
        "acc_stderr": 0.046166311118017125,
        "acc_norm": 0.6481481481481481,
        "acc_norm_stderr": 0.046166311118017125
    },
    "harness|hendrycksTest-logical_fallacies|5": {
        "acc": 0.5766871165644172,
        "acc_stderr": 0.03881891213334384,
        "acc_norm": 0.5766871165644172,
        "acc_norm_stderr": 0.03881891213334384
    },
    "harness|hendrycksTest-machine_learning|5": {
        "acc": 0.3125,
        "acc_stderr": 0.043994650575715215,
        "acc_norm": 0.3125,
        "acc_norm_stderr": 0.043994650575715215
    },
    "harness|hendrycksTest-management|5": {
        "acc": 0.6116504854368932,
        "acc_stderr": 0.0482572933735639,
        "acc_norm": 0.6116504854368932,
        "acc_norm_stderr": 0.0482572933735639
    },
    "harness|hendrycksTest-marketing|5": {
        "acc": 0.7393162393162394,
        "acc_stderr": 0.028760348956523414,
        "acc_norm": 0.7393162393162394,
        "acc_norm_stderr": 0.028760348956523414
    },
    "harness|hendrycksTest-medical_genetics|5": {
        "acc": 0.5,
        "acc_stderr": 0.050251890762960605,
        "acc_norm": 0.5,
        "acc_norm_stderr": 0.050251890762960605
    },
    "harness|hendrycksTest-miscellaneous|5": {
        "acc": 0.6807151979565773,
        "acc_stderr": 0.01667126174953871,
        "acc_norm": 0.6807151979565773,
        "acc_norm_stderr": 0.01667126174953871
    },
    "harness|hendrycksTest-moral_disputes|5": {
        "acc": 0.6098265895953757,
        "acc_stderr": 0.026261677607806642,
        "acc_norm": 0.6098265895953757,
        "acc_norm_stderr": 0.026261677607806642
    },
    "harness|hendrycksTest-moral_scenarios|5": {
        "acc": 0.24916201117318434,
        "acc_stderr": 0.014465893829859936,
        "acc_norm": 0.24916201117318434,
        "acc_norm_stderr": 0.014465893829859936
    },
    "harness|hendrycksTest-nutrition|5": {
        "acc": 0.545751633986928,
        "acc_stderr": 0.028509807802626595,
        "acc_norm": 0.545751633986928,
        "acc_norm_stderr": 0.028509807802626595
    },
    "harness|hendrycksTest-philosophy|5": {
        "acc": 0.6077170418006431,
        "acc_stderr": 0.027731258647011998,
        "acc_norm": 0.6077170418006431,
        "acc_norm_stderr": 0.027731258647011998
    },
    "harness|hendrycksTest-prehistory|5": {
        "acc": 0.5802469135802469,
        "acc_stderr": 0.027460099557005135,
        "acc_norm": 0.5802469135802469,
        "acc_norm_stderr": 0.027460099557005135
    },
    "harness|hendrycksTest-professional_accounting|5": {
        "acc": 0.40425531914893614,
        "acc_stderr": 0.029275532159704725,
        "acc_norm": 0.40425531914893614,
        "acc_norm_stderr": 0.029275532159704725
    },
    "harness|hendrycksTest-professional_law|5": {
        "acc": 0.43089960886571055,
        "acc_stderr": 0.012647695889547228,
        "acc_norm": 0.43089960886571055,
        "acc_norm_stderr": 0.012647695889547228
    },
    "harness|hendrycksTest-professional_medicine|5": {
        "acc": 0.5367647058823529,
        "acc_stderr": 0.03029061918048569,
        "acc_norm": 0.5367647058823529,
        "acc_norm_stderr": 0.03029061918048569
    },
    "harness|hendrycksTest-professional_psychology|5": {
        "acc": 0.5196078431372549,
        "acc_stderr": 0.020212274976302957,
        "acc_norm": 0.5196078431372549,
        "acc_norm_stderr": 0.020212274976302957
    },
    "harness|hendrycksTest-public_relations|5": {
        "acc": 0.6272727272727273,
        "acc_stderr": 0.04631381319425464,
        "acc_norm": 0.6272727272727273,
        "acc_norm_stderr": 0.04631381319425464
    },
    "harness|hendrycksTest-security_studies|5": {
        "acc": 0.5102040816326531,
        "acc_stderr": 0.03200255347893783,
        "acc_norm": 0.5102040816326531,
        "acc_norm_stderr": 0.03200255347893783
    },
    "harness|hendrycksTest-sociology|5": {
        "acc": 0.6965174129353234,
        "acc_stderr": 0.032510068164586195,
        "acc_norm": 0.6965174129353234,
        "acc_norm_stderr": 0.032510068164586195
    },
    "harness|hendrycksTest-us_foreign_policy|5": {
        "acc": 0.72,
        "acc_stderr": 0.045126085985421296,
        "acc_norm": 0.72,
        "acc_norm_stderr": 0.045126085985421296
    },
    "harness|hendrycksTest-virology|5": {
        "acc": 0.4457831325301205,
        "acc_stderr": 0.03869543323472101,
        "acc_norm": 0.4457831325301205,
        "acc_norm_stderr": 0.03869543323472101
    },
    "harness|hendrycksTest-world_religions|5": {
        "acc": 0.6783625730994152,
        "acc_stderr": 0.03582529442573122,
        "acc_norm": 0.6783625730994152,
        "acc_norm_stderr": 0.03582529442573122
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.31701346389228885,
        "mc1_stderr": 0.016289203374403392,
        "mc2": 0.4637339879535235,
        "mc2_stderr": 0.014647929379084504
    },
    "harness|winogrande|5": {
        "acc": 0.7624309392265194,
        "acc_stderr": 0.011961298905803159
    },
    "harness|gsm8k|5": {
        "acc": 0.16906747536012132,
        "acc_stderr": 0.010324171445497347
    }
}
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used. -->

### Direct Use

<!-- This section describes suitable use cases for the dataset.
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
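As a usage note for the evaluation results above: the harness output is a flat mapping from task name to accuracy fields, so per-category aggregates (e.g. a macro-average over the MMLU `hendrycksTest` subtasks) are easy to compute. A minimal sketch — the small `results` dict below is a mock in the same shape as the JSON above, not the full run:

```python
# Macro-average accuracy over the MMLU (hendrycksTest) subtasks from a
# harness-style results mapping. The dict below is a small mock in the
# same shape as the full results JSON above, for illustration only.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.7393162393162394},
    "harness|hendrycksTest-virology|5": {"acc": 0.4457831325301205},
    "harness|gsm8k|5": {"acc": 0.16906747536012132},  # not an MMLU task; excluded below
}

mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU macro-average over {len(mmlu_accs)} tasks: {macro_avg:.4f}")
```

The same pattern works on the full results file, since every MMLU subtask shares the `harness|hendrycksTest-` prefix.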
distilled-from-one-sec-cv12/chunk_14
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 1076662808 num_examples: 209794 download_size: 1097593758 dataset_size: 1076662808 --- # Dataset Card for "chunk_14" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
chenz16/curriculum_benchmark
--- license: mit ---
MMInstruction/ArxivCap
--- license: cc-by-sa-4.0 task_categories: - image-to-text language: - en pretty_name: ArxivCap size_categories: - 1M<n<10M tags: - arxiv - multi-modal --- # Dataset Card for ArxivCap ## Table of Contents - [Dataset Card for ArxivCap](#dataset-card-for-arxivcap) - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Curation Process](#curation-process) - [Dataset Structure](#dataset-structure) - [Data Loading](#data-loading) - [Data Fields](#data-fields) - [Data Instances](#data-instances) - [Additional Information](#additional-information) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) ## Dataset Description - **Paper:** [Multimodal ArXiv](https://arxiv.org/abs/2403.00231) - **Point of Contact:** nlp.lilei@gmail.com - **HomePage**: https://mm-arxiv.github.io/ ### Data Instances <details> <summary>Example-1 of single (image, caption) pairs</summary> "......" stands for omitted parts. ![example-1](images/example-1.png) ``` { 'src': 'arXiv_src_2112_060/2112.08947', 'meta': { 'meta_from_kaggle': { 'journey': '', 'license': 'http://arxiv.org/licenses/nonexclusive-distrib/1.0/', 'categories': 'cs.ET' }, 'meta_from_s2': { 'citationCount': 8, 'influentialCitationCount': 0, 'publicationTypes': ['JournalArticle'] } }, 'arxiv_id': '2112.08947', 'title': 'Computational metrics and parameters of an injection-locked large area semiconductor laser for neural network computing', 'abstract': 'Artificial neural networks have become a staple computing technique in many fields. Yet, they present fundamental differences with classical computing hardware in the way they process information. Photonic implementations of neural network architectures potentially offer fundamental advantages over their electronic counterparts in terms of speed, processing parallelism, scalability and energy efficiency. 
Scalable and high performance photonic neural networks (PNNs) have been demonstrated, yet they remain scarce. In this work, we study the performance of such a scalable, fully parallel and autonomous PNN based on a large area vertical-cavity surface-emitting laser\n(LA-VCSEL). We show how the performance varies with different physical parameters, namely, injection wavelength, injection power, and bias current. Furthermore, we link these physical parameters to the general computational measures of consistency and dimensionality. We present a general method of gauging dimensionality in high dimensional nonlinear systems subject to noise, which could be applied to many systems in the context of neuromorphic computing. Our work will inform future implementations of spatially multiplexed VCSEL PNNs.\n', 'caption_images': [ { 'caption': '(a) Working principle of the LA-VCSEL spatially multiplexed reservoir. (b) Input information $\\mathbf{u}$ and the subsequent LA-VCSEL response for 3-bit binary headers. The graph shows the target output $y^{\\text{target}}$ (yellow) for classifying header 001 and different reservoir outputs $y^{\\text{out}}$ of decreasing mean square error (MSE) (red, blue and green). (c) Schematic illustration of the error landscape, showing the MSE as a function of the output weights configuration. The outlined (red, blue and green) Boolean matrices correspond to the output weights giving the output from (b). 
(d) Representative performance of the PNN on a 6-bit header recognition task.', 'cil_pairs': [ { 'sub_caption': '', 'image_file': 'arXiv_src_2112_060/2112.08947_0.jpg', 'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=2016x1063 at 0x7F098E288040>, 'image_ocr': ['(a)', 'LA-VCSEL', 'DMDa', 'DMD', 'MMF', 'DET', 'Win', 'xt', 'Spatial positions', 'Output', 'Input', 'Wint', 'Carrier diffusion', 'Cavity diffraction', 'Reservoir', '(d)50', '6bit HR', 'Error(MSE)', '830', '001', '000', '001', '100', '001', '111', 'ER', 'S', '10', '0', 'Configuration DMD.', '0', '1000', 'Input examples', 'Learning epochs'] } ] } ...... ] } ``` </details> <details> <summary>Example-2 of multiple images and subcaptions</summary> "......" stands for omitted parts. ![example-2](images/example-2.png) ``` { 'src': 'arXiv_src_0309_001/quant-ph0309051', 'meta': { 'meta_from_kaggle': {'journey': '', 'license': '', 'categories': 'quant-ph'}, 'meta_from_s2': {'citationCount': 9, 'influentialCitationCount': 1, 'publicationTypes': ['JournalArticle']} }, 'arxiv_id': 'quant-ph/0309051', 'title': 'Implementing a Quantum Algorithm with Exchange-Coupled Quantum Dots: a Feasibility study.', 'abstract': '\nWe present Monte Carlo wavefunction simulations for quantum computations employing an exchange-coupled array of quantum dots. Employing a combination of experimentally and theoretically available parameters, we find that gate fidelities greater than 98 \\% may be obtained with current experimental and technological capabilities. Application to an encoded 3 qubit\n(nine physical qubits) Deutsch-Josza computation indicates that the algorithmic fidelity is more a question of the total time to implement the gates than of the physical complexity of those gates.\n', 'caption_images': [ ...... { 'caption': 'Representation of analytic sequence of local transformations that transform the 19-exchange sequence $U_{cnot}^{exchange}$ from Ref. {divincenzo00} into the true CNOT in the computational basis. 
The exchange gates and times corresponding to the elementary local transformations are then obtained using the quaternion representation of the desired $SU(2)$ unitaries (see Appendix <ref> for details).', 'cil_pairs': [ { 'sub_caption': 'A single qubit gate ($\\frac{\\sqrt{3}}{2}-\\frac{i}{2}\\sigma_y$) acting on the second logical qubit diagonalizes the 19-gate exchange sequence. The resulting diagonal 4-by-4 matrix is then converted into the C-PHASE by $\\sigma_z$-rotations acting on both the first and the second qubit, with angles $\\phi=0.612497$ and $\\theta=-0.547580$, respectively. These values are determined from the analytic solutions to a linear equation system with 3 unknowns: $\\phi$, $\\theta$ and a global phase. See Appendix <ref> for details as to how these parameters were obtained.', 'image_file': 'arXiv_src_0309_001/quant-ph0309051_4.jpg', 'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=2016x493 at 0x7F102471EF70>, 'image_ocr': ['Exch,', '7', 'C', '2', '+', '2', '2', 'CNOT', '2', '2', 'PHASE'] }, { 'sub_caption': 'The C-PHASE gate can be transformed into the CNOT gate by acting with Hadamard gates on the second qubit before and after the C-PHASE gate.', 'image_file': 'arXiv_src_0309_001/quant-ph0309051_5.jpg', 'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=2016x411 at 0x7F102471EDC0>, 'image_ocr': ['C', '2', 'PHASE'] } ] }, ...... ] } ``` </details> ### Dataset Summary The ArxivCap dataset consists of 6.4 million images and 3.9 million captions with 193 million words from 570k academic papers accompanied with abstracts and titles. (papers before **June 2023**) ### Curation Process Refer to our paper for the curation and filter process. 
## Dataset Structure ### Data Loading ```python from datasets import load_dataset dataset = load_dataset("MMInstruction/ArxivCap") dataset["train"] # list of dictionaries ``` --- ```bash # for quick download on Linux set -e sudo apt-get install git-lfs -y git clone https://huggingface.co/datasets/MMInstruction/ArxivCap cd ArxivCap/data ``` ```python # then you can load the parquet files in Python using something like data = load_dataset( "parquet", data_files="/path/to/parquet/arXiv_src_9912_001.parquet" ) ``` ### Data Fields One record refers to one paper: - src: **String**. "\<Arxiv Tar File Name>/\<Folder Name in Tar File>", e.g. "arXiv_src_2112_060/2112.08947" - arxiv_id: **String**. Arxiv id of the paper, e.g. "2112.08947" - title: **String**. Title of the paper. - abstract: **String**. Abstract of the paper. - meta: - meta_from_kaggle: refers to [arXiv Dataset](https://www.kaggle.com/datasets/Cornell-University/arxiv) - journey: **String**. Information about the journal the paper was published in. - license: **String**. License for the paper. - categories: **String**. Categories / tags in the ArXiv system. - meta_from_s2: refers to [SEMANTIC SCHOLAR](https://api.semanticscholar.org/api-docs/#tag/Paper-Data/operation/get_graph_get_paper) - citationCount: **Integer**. Total number of citations S2 has found for this paper. - influentialCitationCount: **Integer**. See [here](https://www.semanticscholar.org/faq#influential-citations) - publicationTypes: **List[String]**. Journal Article, Conference, Review, etc. - caption_images: - caption: **String**. Main caption. - cil_pairs: - sub_caption: **String**. Subcaption for the image. - image_file: **String**. Unique file name for the image. - image: **PIL.Image.Image**. A PIL.Image.Image object containing the image. - image_ocr: **List[String]**.
OCR result for the image using [PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR) ```python import datasets features = datasets.Features( { "src": datasets.Value("string"), "arxiv_id": datasets.Value("string"), "title": datasets.Value("string"), "abstract": datasets.Value("string"), "meta": { "meta_from_kaggle": { "journey": datasets.Value("string"), "license": datasets.Value("string"), "categories": datasets.Value("string"), }, "meta_from_s2": { "citationCount": datasets.Value("int32"), "influentialCitationCount": datasets.Value("int32"), "publicationTypes": [datasets.Value("string")], } }, "caption_images": [{ "caption": datasets.Value("string"), "cil_pairs": [{ "sub_caption": datasets.Value("string"), "image_file": datasets.Value("string"), "image": datasets.Image(), "image_ocr": [datasets.Value("string")], }] }] } ) ``` ## Additional Information ### Licensing Information ArxivCap is released under [CC BY-NC-SA 4.0](http://creativecommons.org/licenses/by-nc-sa/4.0/). ### Citation Information ``` @misc{li2024multimodal, title={Multimodal ArXiv: A Dataset for Improving Scientific Comprehension of Large Vision-Language Models}, author={Lei Li and Yuqi Wang and Runxin Xu and Peiyi Wang and Xiachong Feng and Lingpeng Kong and Qi Liu}, year={2024}, eprint={2403.00231}, archivePrefix={arXiv}, primaryClass={cs.CV} } ```
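The nested `caption_images`/`cil_pairs` layout documented above flattens naturally into (image, caption) pairs for captioning-style use. A minimal sketch against a mock record — the field names follow the schema above, but the values below are placeholders, and real records also carry a `PIL.Image.Image` object under the `image` key:

```python
# Flatten one ArxivCap-style record into (image_file, caption) pairs,
# joining the figure-level caption with any per-image sub-caption.
# `record` is a small mock mirroring the schema documented above.
record = {
    "arxiv_id": "2112.08947",
    "caption_images": [
        {
            "caption": "Working principle of the LA-VCSEL reservoir.",
            "cil_pairs": [
                {"sub_caption": "", "image_file": "arXiv_src_2112_060/2112.08947_0.jpg"},
            ],
        },
        {
            "caption": "Local transformations of the exchange sequence.",
            "cil_pairs": [
                {"sub_caption": "Single-qubit gate.", "image_file": "a_1.jpg"},
                {"sub_caption": "C-PHASE to CNOT.", "image_file": "a_2.jpg"},
            ],
        },
    ],
}

def image_caption_pairs(record):
    """Yield (image_file, caption) tuples from one ArxivCap record."""
    for fig in record["caption_images"]:
        for cil in fig["cil_pairs"]:
            # Append the sub-caption to the main caption when one exists.
            caption = fig["caption"]
            if cil["sub_caption"]:
                caption = f'{caption} {cil["sub_caption"]}'
            yield cil["image_file"], caption

pairs = list(image_caption_pairs(record))
print(pairs)
```

Applied over the train split, this yields one (image, caption) example per `cil_pair` rather than one per paper.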
stjarvie/question_to_sql_with_ddl_small
--- dataset_info: features: - name: question dtype: string - name: sql dtype: string - name: schema dtype: string splits: - name: train num_bytes: 1856 num_examples: 10 download_size: 3195 dataset_size: 1856 --- # Dataset Card for "question_to_sql_with_ddl_small" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CCAE/CCAE-Corpus
--- license: cc-by-nc-nd-4.0 task_categories: - text-classification - text-generation language: - en pretty_name: Colorful Candies Are Exciting size_categories: - 100K<n<1M --- # CCAE: A Corpus of Chinese-based Asian Englishes ## Dataset Description - **Repository:** https://github.com/jacklanda/CCAE - **Paper:** ### Dataset Summary Language models have become foundational in many NLP applications, but they have not been well applied to language variety studies, even for the most widely used language, English. This paper represents one of the few initial efforts to apply NLP technology within the paradigm of World Englishes, specifically by creating a multi-variety corpus for studying Asian Englishes. We present an overview of CCAE — Corpus of Chinese-based Asian English, a suite of corpora comprising six Chinese-based Asian English varieties. It is based on 340 million tokens in 448 thousand web documents from six regions. The ontology of the data makes the corpus a helpful resource with enormous research potential for Asian Englishes (especially for Chinese Englishes, for which no publicly accessible corpus has existed so far) and an ideal source for variety-specific language modeling and downstream tasks, thus setting the stage for NLP-based World Englishes studies. Preliminary experiments on this corpus reveal the practical value of CCAE. ### Languages Six Chinese-based Asian varieties of English: CHE, HKE, MCE, TWE, MYE, SGE ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators?
[More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
audibeal/my-cool-dataset-4
--- configs: - config_name: train data_files: "train.csv" - config_name: dev data_files: "dev.csv" - config_name: test data_files: "test.csv" ---
abideen/lex-dpo
--- dataset_info: features: - name: input dtype: string - name: generation_model sequence: string - name: generation_prompt sequence: string - name: raw_generation_responses sequence: string - name: generations sequence: string splits: - name: train num_bytes: 30489228 num_examples: 20000 download_size: 15432398 dataset_size: 30489228 configs: - config_name: default data_files: - split: train path: data/train-* ---
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_30_10000000
--- dataset_info: features: - name: id dtype: int64 - name: response dtype: string splits: - name: train num_bytes: 192545 num_examples: 6699 download_size: 122913 dataset_size: 192545 --- # Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_30_10000000" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
mask-distilled-one-sec-cv12/chunk_271
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 921397400 num_examples: 180950 download_size: 937535145 dataset_size: 921397400 --- # Dataset Card for "chunk_271" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
sethapun/arithmetic_2md_1to500
--- dataset_info: features: - name: expression dtype: string - name: answer dtype: float64 - name: label dtype: class_label: names: '0': 'false' '1': 'true' splits: - name: train num_bytes: 61140 num_examples: 2000 - name: validation num_bytes: 12234 num_examples: 400 download_size: 34855 dataset_size: 73374 --- # Dataset Card for "arithmetic_2md_1to500" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Isaak-Carter/JOSIE_v928.16
--- dataset_info: features: - name: sample dtype: string splits: - name: train num_bytes: 6499831 num_examples: 2348 download_size: 3066207 dataset_size: 6499831 --- # Dataset Card for "JOSIE_v928.16" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_nnethercott__llava-v1.5-7b-hf-vicuna
--- pretty_name: Evaluation run of nnethercott/llava-v1.5-7b-hf-vicuna dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [nnethercott/llava-v1.5-7b-hf-vicuna](https://huggingface.co/nnethercott/llava-v1.5-7b-hf-vicuna)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split always points to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nnethercott__llava-v1.5-7b-hf-vicuna\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-02-23T22:56:58.109084](https://huggingface.co/datasets/open-llm-leaderboard/details_nnethercott__llava-v1.5-7b-hf-vicuna/blob/main/results_2024-02-23T22-56-58.109084.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5145109813550153,\n\ \ \"acc_stderr\": 0.03407762208548462,\n \"acc_norm\": 0.5211185331079348,\n\ \ \"acc_norm_stderr\": 0.03483915715750226,\n \"mc1\": 0.3023255813953488,\n\ \ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.45861084561836213,\n\ \ \"mc2_stderr\": 0.01545760404502721\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.48208191126279865,\n \"acc_stderr\": 0.014602005585490976,\n\ \ \"acc_norm\": 0.5264505119453925,\n \"acc_norm_stderr\": 0.014590931358120169\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5707030472017527,\n\ \ \"acc_stderr\": 0.004939642460172578,\n \"acc_norm\": 0.7609042023501295,\n\ \ \"acc_norm_stderr\": 0.004256596457810718\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\ \ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\ \ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.040657710025626036,\n\ \ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.040657710025626036\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\ \ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \ \ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.5320754716981132,\n \"acc_stderr\": 0.030709486992556545,\n\ \ \"acc_norm\": 0.5320754716981132,\n \"acc_norm_stderr\": 0.030709486992556545\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\ \ \"acc_stderr\": 0.04155319955593146,\n \"acc_norm\": 0.5555555555555556,\n\ \ \"acc_norm_stderr\": 
0.04155319955593146\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\ \ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n\ \ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \ \ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \ \ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n\ \ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.4913294797687861,\n\ \ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03708284662416544,\n\ \ \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03708284662416544\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\ \ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108102,\n\ \ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108102\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\ \ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\ \ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n\ \ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.2857142857142857,\n \"acc_stderr\": 0.02326651221373057,\n \"\ acc_norm\": 
0.2857142857142857,\n \"acc_norm_stderr\": 0.02326651221373057\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\ \ \"acc_stderr\": 0.040406101782088394,\n \"acc_norm\": 0.2857142857142857,\n\ \ \"acc_norm_stderr\": 0.040406101782088394\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \ \ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.5645161290322581,\n \"acc_stderr\": 0.028206225591502748,\n \"\ acc_norm\": 0.5645161290322581,\n \"acc_norm_stderr\": 0.028206225591502748\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.3793103448275862,\n \"acc_stderr\": 0.034139638059062345,\n \"\ acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.034139638059062345\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\ : 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\ \ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.5808080808080808,\n \"acc_stderr\": 0.03515520728670417,\n \"\ acc_norm\": 0.5808080808080808,\n \"acc_norm_stderr\": 0.03515520728670417\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.03201867122877794,\n\ \ \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.03201867122877794\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.4794871794871795,\n \"acc_stderr\": 0.02532966316348994,\n \ \ \"acc_norm\": 0.4794871794871795,\n \"acc_norm_stderr\": 0.02532966316348994\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \ \ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.49159663865546216,\n \"acc_stderr\": 0.032473902765696686,\n\ \ \"acc_norm\": 0.49159663865546216,\n \"acc_norm_stderr\": 0.032473902765696686\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\ acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.7155963302752294,\n \"acc_stderr\": 0.019342036587702588,\n \"\ acc_norm\": 0.7155963302752294,\n \"acc_norm_stderr\": 0.019342036587702588\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696043,\n \"\ acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696043\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.7303921568627451,\n \"acc_stderr\": 0.031145570659486782,\n \"\ acc_norm\": 0.7303921568627451,\n \"acc_norm_stderr\": 0.031145570659486782\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994927,\n \ \ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994927\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6098654708520179,\n\ \ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.6098654708520179,\n\ \ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n\ \ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068382,\n \"\ acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068382\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\ \ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n\ \ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.5276073619631901,\n \"acc_stderr\": 0.0392237829061099,\n\ \ \"acc_norm\": 0.5276073619631901,\n \"acc_norm_stderr\": 0.0392237829061099\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\ \ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\ \ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n\ \ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\ \ \"acc_stderr\": 0.02466249684520981,\n \"acc_norm\": 0.8290598290598291,\n\ \ \"acc_norm_stderr\": 0.02466249684520981\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \ \ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.698595146871009,\n\ \ \"acc_stderr\": 0.016409091097268777,\n \"acc_norm\": 0.698595146871009,\n\ \ \"acc_norm_stderr\": 0.016409091097268777\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.5202312138728323,\n \"acc_stderr\": 0.026897049996382875,\n\ \ \"acc_norm\": 0.5202312138728323,\n \"acc_norm_stderr\": 0.026897049996382875\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\ \ \"acc_stderr\": 0.014465893829859926,\n \"acc_norm\": 
0.24916201117318434,\n\ \ \"acc_norm_stderr\": 0.014465893829859926\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.028213504177824096,\n\ \ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.028213504177824096\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\ \ \"acc_stderr\": 0.02795048149440127,\n \"acc_norm\": 0.5884244372990354,\n\ \ \"acc_norm_stderr\": 0.02795048149440127\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.027339546640662734,\n\ \ \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.027339546640662734\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.3546099290780142,\n \"acc_stderr\": 0.02853865002887864,\n \ \ \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.02853865002887864\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37353324641460234,\n\ \ \"acc_stderr\": 0.012354994823515267,\n \"acc_norm\": 0.37353324641460234,\n\ \ \"acc_norm_stderr\": 0.012354994823515267\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n\ \ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.5081699346405228,\n \"acc_stderr\": 0.020225134343057265,\n \ \ \"acc_norm\": 0.5081699346405228,\n \"acc_norm_stderr\": 0.020225134343057265\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\ \ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\ \ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.031067211262872478,\n\ \ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.031067211262872478\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n\ \ \"acc_stderr\": 0.03203841040213322,\n \"acc_norm\": 0.7114427860696517,\n\ \ \"acc_norm_stderr\": 0.03203841040213322\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \ \ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\ \ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\ \ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\ \ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3023255813953488,\n\ \ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.45861084561836213,\n\ \ \"mc2_stderr\": 0.01545760404502721\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7205998421468035,\n \"acc_stderr\": 0.01261082653940467\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15314632297194844,\n \ \ \"acc_stderr\": 0.00991972815279147\n }\n}\n```" repo_url: https://huggingface.co/nnethercott/llava-v1.5-7b-hf-vicuna leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|arc:challenge|25_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-02-23T22-56-58.109084.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|gsm8k|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_02_23T22_56_58.109084 path: 
- '**/details_harness|hellaswag|10_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T22-56-58.109084.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-23T22-56-58.109084.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T22-56-58.109084.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T22-56-58.109084.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-management|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T22-56-58.109084.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-02-23T22-56-58.109084.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T22-56-58.109084.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-management|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-marketing|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-virology|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T22-56-58.109084.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|truthfulqa:mc|0_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-02-23T22-56-58.109084.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_02_23T22_56_58.109084 path: - '**/details_harness|winogrande|5_2024-02-23T22-56-58.109084.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-02-23T22-56-58.109084.parquet' - config_name: results data_files: - split: 
2024_02_23T22_56_58.109084 path: - results_2024-02-23T22-56-58.109084.parquet - split: latest path: - results_2024-02-23T22-56-58.109084.parquet --- # Dataset Card for Evaluation run of nnethercott/llava-v1.5-7b-hf-vicuna <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [nnethercott/llava-v1.5-7b-hf-vicuna](https://huggingface.co/nnethercott/llava-v1.5-7b-hf-vicuna) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_nnethercott__llava-v1.5-7b-hf-vicuna", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-02-23T22:56:58.109084](https://huggingface.co/datasets/open-llm-leaderboard/details_nnethercott__llava-v1.5-7b-hf-vicuna/blob/main/results_2024-02-23T22-56-58.109084.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.5145109813550153, "acc_stderr": 0.03407762208548462, "acc_norm": 0.5211185331079348, "acc_norm_stderr": 0.03483915715750226, "mc1": 0.3023255813953488, "mc1_stderr": 0.016077509266133026, "mc2": 0.45861084561836213, "mc2_stderr": 0.01545760404502721 }, "harness|arc:challenge|25": { "acc": 0.48208191126279865, "acc_stderr": 0.014602005585490976, "acc_norm": 0.5264505119453925, "acc_norm_stderr": 0.014590931358120169 }, "harness|hellaswag|10": { "acc": 0.5707030472017527, "acc_stderr": 0.004939642460172578, "acc_norm": 0.7609042023501295, "acc_norm_stderr": 0.004256596457810718 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.34, "acc_stderr": 0.04760952285695236, "acc_norm": 0.34, "acc_norm_stderr": 0.04760952285695236 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.5037037037037037, "acc_stderr": 0.04319223625811331, "acc_norm": 0.5037037037037037, "acc_norm_stderr": 0.04319223625811331 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.5197368421052632, "acc_stderr": 0.040657710025626036, "acc_norm": 0.5197368421052632, "acc_norm_stderr": 0.040657710025626036 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.55, "acc_stderr": 0.049999999999999996, "acc_norm": 0.55, "acc_norm_stderr": 0.049999999999999996 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.5320754716981132, "acc_stderr": 0.030709486992556545, "acc_norm": 0.5320754716981132, "acc_norm_stderr": 0.030709486992556545 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.5555555555555556, "acc_stderr": 0.04155319955593146, "acc_norm": 0.5555555555555556, "acc_norm_stderr": 0.04155319955593146 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.37, "acc_stderr": 0.048523658709391, "acc_norm": 0.37, "acc_norm_stderr": 0.048523658709391 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.41, "acc_stderr": 0.04943110704237102, "acc_norm": 0.41, "acc_norm_stderr": 
0.04943110704237102 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.41, "acc_stderr": 0.049431107042371025, "acc_norm": 0.41, "acc_norm_stderr": 0.049431107042371025 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.4913294797687861, "acc_stderr": 0.03811890988940412, "acc_norm": 0.4913294797687861, "acc_norm_stderr": 0.03811890988940412 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.16666666666666666, "acc_stderr": 0.03708284662416544, "acc_norm": 0.16666666666666666, "acc_norm_stderr": 0.03708284662416544 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.66, "acc_stderr": 0.04760952285695237, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695237 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.43829787234042555, "acc_stderr": 0.03243618636108102, "acc_norm": 0.43829787234042555, "acc_norm_stderr": 0.03243618636108102 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.2807017543859649, "acc_stderr": 0.042270544512322, "acc_norm": 0.2807017543859649, "acc_norm_stderr": 0.042270544512322 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.42758620689655175, "acc_stderr": 0.041227371113703316, "acc_norm": 0.42758620689655175, "acc_norm_stderr": 0.041227371113703316 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2857142857142857, "acc_stderr": 0.02326651221373057, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.02326651221373057 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.2857142857142857, "acc_stderr": 0.040406101782088394, "acc_norm": 0.2857142857142857, "acc_norm_stderr": 0.040406101782088394 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.34, "acc_stderr": 0.047609522856952365, "acc_norm": 0.34, "acc_norm_stderr": 0.047609522856952365 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.5645161290322581, "acc_stderr": 0.028206225591502748, "acc_norm": 0.5645161290322581, "acc_norm_stderr": 0.028206225591502748 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.3793103448275862, "acc_stderr": 0.034139638059062345, "acc_norm": 0.3793103448275862, "acc_norm_stderr": 0.034139638059062345 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.49, "acc_stderr": 0.05024183937956912, "acc_norm": 0.49, "acc_norm_stderr": 0.05024183937956912 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.6848484848484848, "acc_stderr": 0.0362773057502241, "acc_norm": 0.6848484848484848, "acc_norm_stderr": 0.0362773057502241 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.5808080808080808, "acc_stderr": 0.03515520728670417, "acc_norm": 0.5808080808080808, "acc_norm_stderr": 0.03515520728670417 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.7305699481865285, "acc_stderr": 0.03201867122877794, "acc_norm": 0.7305699481865285, "acc_norm_stderr": 0.03201867122877794 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.4794871794871795, "acc_stderr": 0.02532966316348994, "acc_norm": 0.4794871794871795, "acc_norm_stderr": 0.02532966316348994 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.24814814814814815, "acc_stderr": 0.0263357394040558, "acc_norm": 0.24814814814814815, "acc_norm_stderr": 0.0263357394040558 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.49159663865546216, "acc_stderr": 0.032473902765696686, "acc_norm": 0.49159663865546216, "acc_norm_stderr": 0.032473902765696686 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 0.038425817186598696, "acc_norm": 0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.7155963302752294, "acc_stderr": 0.019342036587702588, "acc_norm": 0.7155963302752294, "acc_norm_stderr": 0.019342036587702588 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.4074074074074074, "acc_stderr": 
0.03350991604696043, "acc_norm": 0.4074074074074074, "acc_norm_stderr": 0.03350991604696043 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.7303921568627451, "acc_stderr": 0.031145570659486782, "acc_norm": 0.7303921568627451, "acc_norm_stderr": 0.031145570659486782 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7383966244725738, "acc_stderr": 0.028609516716994927, "acc_norm": 0.7383966244725738, "acc_norm_stderr": 0.028609516716994927 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6098654708520179, "acc_stderr": 0.03273766725459157, "acc_norm": 0.6098654708520179, "acc_norm_stderr": 0.03273766725459157 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.6335877862595419, "acc_stderr": 0.04225875451969637, "acc_norm": 0.6335877862595419, "acc_norm_stderr": 0.04225875451969637 }, "harness|hendrycksTest-international_law|5": { "acc": 0.6115702479338843, "acc_stderr": 0.04449270350068382, "acc_norm": 0.6115702479338843, "acc_norm_stderr": 0.04449270350068382 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.5833333333333334, "acc_stderr": 0.04766075165356461, "acc_norm": 0.5833333333333334, "acc_norm_stderr": 0.04766075165356461 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.5276073619631901, "acc_stderr": 0.0392237829061099, "acc_norm": 0.5276073619631901, "acc_norm_stderr": 0.0392237829061099 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4732142857142857, "acc_stderr": 0.047389751192741546, "acc_norm": 0.4732142857142857, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.6796116504854369, "acc_stderr": 0.04620284082280042, "acc_norm": 0.6796116504854369, "acc_norm_stderr": 0.04620284082280042 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8290598290598291, "acc_stderr": 0.02466249684520981, "acc_norm": 0.8290598290598291, "acc_norm_stderr": 0.02466249684520981 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.61, "acc_stderr": 
0.04902071300001974, "acc_norm": 0.61, "acc_norm_stderr": 0.04902071300001974 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.698595146871009, "acc_stderr": 0.016409091097268777, "acc_norm": 0.698595146871009, "acc_norm_stderr": 0.016409091097268777 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.5202312138728323, "acc_stderr": 0.026897049996382875, "acc_norm": 0.5202312138728323, "acc_norm_stderr": 0.026897049996382875 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.24916201117318434, "acc_stderr": 0.014465893829859926, "acc_norm": 0.24916201117318434, "acc_norm_stderr": 0.014465893829859926 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.5849673202614379, "acc_stderr": 0.028213504177824096, "acc_norm": 0.5849673202614379, "acc_norm_stderr": 0.028213504177824096 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.5884244372990354, "acc_stderr": 0.02795048149440127, "acc_norm": 0.5884244372990354, "acc_norm_stderr": 0.02795048149440127 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.5925925925925926, "acc_stderr": 0.027339546640662734, "acc_norm": 0.5925925925925926, "acc_norm_stderr": 0.027339546640662734 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.3546099290780142, "acc_stderr": 0.02853865002887864, "acc_norm": 0.3546099290780142, "acc_norm_stderr": 0.02853865002887864 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.37353324641460234, "acc_stderr": 0.012354994823515267, "acc_norm": 0.37353324641460234, "acc_norm_stderr": 0.012354994823515267 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.5147058823529411, "acc_stderr": 0.03035969707904612, "acc_norm": 0.5147058823529411, "acc_norm_stderr": 0.03035969707904612 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.5081699346405228, "acc_stderr": 0.020225134343057265, "acc_norm": 0.5081699346405228, "acc_norm_stderr": 0.020225134343057265 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.5363636363636364, "acc_stderr": 
0.04776449162396197, "acc_norm": 0.5363636363636364, "acc_norm_stderr": 0.04776449162396197 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.6204081632653061, "acc_stderr": 0.031067211262872478, "acc_norm": 0.6204081632653061, "acc_norm_stderr": 0.031067211262872478 }, "harness|hendrycksTest-sociology|5": { "acc": 0.7114427860696517, "acc_stderr": 0.03203841040213322, "acc_norm": 0.7114427860696517, "acc_norm_stderr": 0.03203841040213322 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.72, "acc_stderr": 0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-virology|5": { "acc": 0.4397590361445783, "acc_stderr": 0.03864139923699121, "acc_norm": 0.4397590361445783, "acc_norm_stderr": 0.03864139923699121 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.7485380116959064, "acc_stderr": 0.033275044238468436, "acc_norm": 0.7485380116959064, "acc_norm_stderr": 0.033275044238468436 }, "harness|truthfulqa:mc|0": { "mc1": 0.3023255813953488, "mc1_stderr": 0.016077509266133026, "mc2": 0.45861084561836213, "mc2_stderr": 0.01545760404502721 }, "harness|winogrande|5": { "acc": 0.7205998421468035, "acc_stderr": 0.01261082653940467 }, "harness|gsm8k|5": { "acc": 0.15314632297194844, "acc_stderr": 0.00991972815279147 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
lmms-lab/HallusionBench
--- dataset_info: features: - name: category dtype: string - name: subcategory dtype: string - name: visual_input dtype: string - name: set_id dtype: string - name: figure_id dtype: string - name: sample_note dtype: string - name: question_id dtype: string - name: question dtype: string - name: gt_answer_details dtype: string - name: gt_answer dtype: string - name: filename dtype: string - name: image dtype: image splits: - name: image num_bytes: 431997264.0 num_examples: 951 - name: non_image num_bytes: 41136.0 num_examples: 178 download_size: 146553615 dataset_size: 432038400.0 configs: - config_name: default data_files: - split: image path: data/image-* - split: non_image path: data/non_image-* --- <p align="center" width="100%"> <img src="https://i.postimg.cc/g0QRgMVv/WX20240228-113337-2x.png" width="100%" height="80%"> </p> # Large-scale Multi-modality Models Evaluation Suite > Accelerating the development of large-scale multi-modality models (LMMs) with `lmms-eval` 🏠 [Homepage](https://lmms-lab.github.io/) | 📚 [Documentation](docs/README.md) | 🤗 [Huggingface Datasets](https://huggingface.co/lmms-lab) # This Dataset This is a formatted version of [HallusionBench](https://github.com/tianyi-lab/HallusionBench). It is used in our `lmms-eval` pipeline to allow for one-click evaluations of large multi-modality models. 
``` @misc{guan2023hallusionbench, title={HallusionBench: An Advanced Diagnostic Suite for Entangled Language Hallucination & Visual Illusion in Large Vision-Language Models}, author={Tianrui Guan and Fuxiao Liu and Xiyang Wu and Ruiqi Xian and Zongxia Li and Xiaoyu Liu and Xijun Wang and Lichang Chen and Furong Huang and Yaser Yacoob and Dinesh Manocha and Tianyi Zhou}, year={2023}, eprint={2310.14566}, archivePrefix={arXiv}, primaryClass={cs.CV} } @misc{liu2023mitigating, title={Mitigating Hallucination in Large Multi-Modal Models via Robust Instruction Tuning}, author={Fuxiao Liu and Kevin Lin and Linjie Li and Jianfeng Wang and Yaser Yacoob and Lijuan Wang}, year={2023}, eprint={2306.14565}, archivePrefix={arXiv}, primaryClass={cs.CV} } @misc{liu2023mmc, title={MMC: Advancing Multimodal Chart Understanding with Large-scale Instruction Tuning}, author={Fuxiao Liu and Xiaoyang Wang and Wenlin Yao and Jianshu Chen and Kaiqiang Song and Sangwoo Cho and Yaser Yacoob and Dong Yu}, year={2023}, eprint={2311.10774}, archivePrefix={arXiv}, primaryClass={cs.CL} } ```
liuyanchen1015/MULTI_VALUE_qqp_plural_preposed
--- dataset_info: features: - name: question1 dtype: string - name: question2 dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: dev num_bytes: 1637630 num_examples: 9333 - name: test num_bytes: 16390540 num_examples: 93867 - name: train num_bytes: 14677543 num_examples: 83260 download_size: 20059460 dataset_size: 32705713 --- # Dataset Card for "MULTI_VALUE_qqp_plural_preposed" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
FreedomIntelligence/HuatuoGPT2_sft_instruct_GPT4_50K
--- license: apache-2.0 --- This dataset represents the supervised fine-tuning instructions for HuatuoGPT2, with all responses generated by GPT-4. We have made 50,000 entries of this dataset publicly available. - **Repository:** https://github.com/FreedomIntelligence/HuatuoGPT-II
ai4bharat/naamapadam
--- annotations_creators: - machine-generated language_creators: - machine-generated language: - as - bn - gu - hi - kn - ml - mr - or - pa - ta - te license: - cc0-1.0 multilinguality: - multilingual pretty_name: naamapadam size_categories: - 1M<n<10M source_datasets: - original task_categories: - token-classification task_ids: - named-entity-recognition --- # Dataset Card for naamapadam ## Table of Contents - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) ## Dataset Description - **Homepage:** [Needs More Information] - **Repository:** https://github.com/AI4Bharat/indicner - **Paper:** [Needs More Information] - **Leaderboard:** [Needs More Information] - **Point of Contact:** Anoop Kunchukuttan ### Dataset Summary Naamapadam is the largest publicly available Named Entity Annotated dataset for 11 Indic languages. This corpus was created by projecting named entities from the English side to the Indic language side of the English-Indic languages parallel corpus. The dataset additionally contains a manually labelled test set for 8 Indic languages containing 500-1000 sentences.
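The projection idea described in the summary above can be sketched in a few lines. This is an illustrative toy only: the function name, the integer tag values, and the alignment format are assumptions made for the example, not the actual naamapadam pipeline.

```python
# Illustrative sketch of projecting NER labels through word alignments.
# Tag values and the alignment format are assumptions for this example.

def project_labels(en_tags, alignment, num_tgt_tokens, outside_tag=0):
    """Copy each English token's NER tag onto its aligned target token.

    en_tags: integer NER tags for the English sentence.
    alignment: (en_index, tgt_index) word-alignment pairs.
    num_tgt_tokens: number of tokens in the target (Indic) sentence.
    Unaligned target tokens keep the "outside" tag.
    """
    tgt_tags = [outside_tag] * num_tgt_tokens
    for en_i, tgt_i in alignment:
        tgt_tags[tgt_i] = en_tags[en_i]
    return tgt_tags

# Toy example: a 3-token English sentence aligned to a 3-token target
# sentence with different word order.
en_tags = [1, 0, 3]                   # e.g. person / outside / organization
alignment = [(0, 2), (1, 1), (2, 0)]
print(project_labels(en_tags, alignment, 3))  # [3, 0, 1]
```

In the actual dataset, the alignments are learned from the Samanantar parallel corpus, as described in the Dataset Creation section below.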
### Supported Tasks and Leaderboards **Tasks:** NER on Indian languages. **Leaderboards:** Currently there is no Leaderboard for this dataset. ### Languages - `Assamese (as)` - `Bengali (bn)` - `Gujarati (gu)` - `Kannada (kn)` - `Hindi (hi)` - `Malayalam (ml)` - `Marathi (mr)` - `Oriya (or)` - `Punjabi (pa)` - `Tamil (ta)` - `Telugu (te)` ## Dataset Structure ### Data Instances {'words': ['उन्हेनें', 'शिकांगों','में','बोरोडिन','की','पत्नी','को','तथा','वाशिंगटन','में','रूसी','व्यापार','संघ','को','पैसे','भेजे','।'], 'ner': [0, 3, 0, 1, 0, 0, 0, 0, 3, 0, 5, 6, 6, 0, 0, 0, 0], } ### Data Fields - `words`: Raw tokens in the dataset. - `ner`: the NER tags for this dataset. ### Data Splits (to be updated, see paper for correct numbers) | Language | Train | Validation | Test | |---:|---:|---:|---:| | as | 10266 | 52 | 51 | | bn | 961679 | 4859 | 607 | | gu | 472845 | 2389 | 50 | | hi | 985787 | 13460 | 437 | | kn | 471763 | 2381 | 1019 | | ml | 716652 | 3618 | 974 | | mr | 455248 | 2300 | 1080 | | or | 196793 | 993 | 994 | | pa | 463534 | 2340 | 2342 | | ta | 497882 | 2795 | 49 | | te | 507741 | 2700 | 53 | ## Usage You should have the 'datasets' package installed to be able to use the :rocket: HuggingFace datasets repository. Please use the following command and install via pip: ```bash pip install datasets ``` To use the dataset, please use:<br/> ```python from datasets import load_dataset naamapadam = load_dataset('ai4bharat/naamapadam') ``` ## Dataset Creation We use the parallel corpus from the Samanantar Dataset between English and the 11 major Indian languages to create the NER dataset. We annotate the English portion of the parallel corpus with an existing state-of-the-art NER model. We use word-level alignments learned from the parallel corpus to project the entity labels from English to the Indic language. ### Curation Rationale naamapadam was built from [Samanantar dataset](https://indicnlp.ai4bharat.org/samanantar/).
This dataset was built for the task of Named Entity Recognition in Indic languages. It was created to provide new resources for Indic languages, which have been under-served in Natural Language Processing. ### Source Data [Samanantar dataset](https://indicnlp.ai4bharat.org/samanantar/) #### Initial Data Collection and Normalization [Needs More Information] #### Who are the source language producers? [Needs More Information] ### Annotations #### Annotation process NER annotations were done following the CoNLL-2003 guidelines. #### Who are the annotators? The annotations for the test set have been done by volunteers who are proficient in the respective languages. We would like to thank all the volunteers: - Anil Mhaske - Anoop Kunchukuttan - Archana Mhaske - Arnav Mhaske - Gowtham Ramesh - Harshit Kedia - Nitin Kedia - Rudramurthy V - Sangeeta Rajagopal - Sumanth Doddapaneni - Vindhya DS - Yash Madhani - Kabir Ahuja - Shallu Rani - Armin Virk ### Personal and Sensitive Information [Needs More Information] ## Considerations for Using the Data ### Social Impact of Dataset The purpose of this dataset is to provide a large-scale Named Entity Recognition dataset for Indic languages. Since the information (data points) has been obtained from public resources, we do not think there is a negative social impact in releasing this data.
### Discussion of Biases [Needs More Information] ### Other Known Limitations [Needs More Information] ## Additional Information ### Dataset Curators [Needs More Information] ### Licensing Information <!-- <a rel="license" float="left" href="http://creativecommons.org/publicdomain/zero/1.0/"> <img src="https://licensebuttons.net/p/zero/1.0/88x31.png" style="border-style: none;" alt="CC0" width="100" /> <img src="https://mirrors.creativecommons.org/presskit/buttons/88x31/png/by.png" style="border-style: none;" alt="CC-BY" width="100" href="http://creativecommons.org/publicdomain/zero/1.0/"/> </a> <br/> --> **CC0 License Statement** <a rel="license" float="left" href="https://creativecommons.org/about/cclicenses/"> <img src="https://licensebuttons.net/p/zero/1.0/88x31.png" style="border-style: none;" alt="CC0" width="100"/> </a> <br> <br> - We do not own any of the text from which this data has been extracted. - We license the actual packaging of the mined data under the [Creative Commons CC0 license (“no rights reserved”)](http://creativecommons.org/publicdomain/zero/1.0). - To the extent possible under law, <a rel="dct:publisher" href="https://ai4bharat.iitm.ac.in/"> <span property="dct:title">AI4Bharat</span></a> has waived all copyright and related or neighboring rights to <span property="dct:title">Naamapadam</span> manually collected data and existing sources. - This work is published from: India. ### Citation Information If you are using the Naamapadam corpus, please cite the following article: ``` @misc{mhaske2022naamapadam, doi = {10.48550/ARXIV.2212.10168}, url = {https://arxiv.org/abs/2212.10168}, author = {Mhaske, Arnav and Kedia, Harshit and Doddapaneni, Sumanth and Khapra, Mitesh M.
and Kumar, Pratyush and Murthy, Rudra and Kunchukuttan, Anoop}, title = {Naamapadam: A Large-Scale Named Entity Annotated Data for Indic Languages}, publisher = {arXiv}, year = {2022}, } ``` <!-- Contributors --> ### Contributors - Arnav Mhaske <sub> ([AI4Bharat](https://ai4bharat.org), [IITM](https://www.iitm.ac.in)) </sub> - Harshit Kedia <sub> ([AI4Bharat](https://ai4bharat.org), [IITM](https://www.iitm.ac.in)) </sub> - Sumanth Doddapaneni <sub> ([AI4Bharat](https://ai4bharat.org), [IITM](https://www.iitm.ac.in)) </sub> - Mitesh M. Khapra <sub> ([AI4Bharat](https://ai4bharat.org), [IITM](https://www.iitm.ac.in)) </sub> - Pratyush Kumar <sub> ([AI4Bharat](https://ai4bharat.org), [Microsoft](https://www.microsoft.com/en-in/), [IITM](https://www.iitm.ac.in)) </sub> - Rudra Murthy <sub> ([AI4Bharat](https://ai4bharat.org), [IBM](https://www.ibm.com)) </sub> - Anoop Kunchukuttan <sub> ([AI4Bharat](https://ai4bharat.org), [Microsoft](https://www.microsoft.com/en-in/), [IITM](https://www.iitm.ac.in)) </sub> This work is the outcome of a volunteer effort as part of the [AI4Bharat initiative](https://ai4bharat.iitm.ac.in). <!-- Contact --> ### Contact - Anoop Kunchukuttan ([anoop.kunchukuttan@gmail.com](mailto:anoop.kunchukuttan@gmail.com)) - Rudra Murthy V ([rmurthyv@in.ibm.com](mailto:rmurthyv@in.ibm.com))
Anonx7/ANALOGDIFFUSIONV1
--- license: openrail ---
Fahad-7864/autotrain-data-trial
--- task_categories: - image-classification --- # AutoTrain Dataset for project: trial ## Dataset Description This dataset has been automatically processed by AutoTrain for project trial. ### Languages The BCP-47 code for the dataset's language is unk. ## Dataset Structure ### Data Instances A sample from this dataset looks as follows: ```json [ { "image": "<32x36 RGBA PIL image>", "target": 0 }, { "image": "<32x36 RGBA PIL image>", "target": 2 } ] ``` ### Dataset Fields The dataset has the following fields (also called "features"): ```json { "image": "Image(decode=True, id=None)", "target": "ClassLabel(names=['Healer_f', 'healer_m', 'ninja_m', 'ranger_m', 'rpgsprites1'], id=None)" } ``` ### Dataset Splits This dataset is split into a train and validation split. The split sizes are as follow: | Split name | Num samples | | ------------ | ------------------- | | train | 45 | | valid | 15 |
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_24_10000000
--- dataset_info: features: - name: id dtype: int64 - name: response dtype: string splits: - name: train num_bytes: 193096 num_examples: 6699 download_size: 124294 dataset_size: 193096 --- # Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_24_10000000" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
tasksource/esci
--- dataset_info: features: - name: example_id dtype: int64 - name: query dtype: string - name: query_id dtype: int64 - name: product_id dtype: string - name: product_locale dtype: string - name: esci_label dtype: string - name: small_version dtype: int64 - name: large_version dtype: int64 - name: product_title dtype: string - name: product_description dtype: string - name: product_bullet_point dtype: string - name: product_brand dtype: string - name: product_color dtype: string - name: product_text dtype: string splits: - name: train num_bytes: 5047037946 num_examples: 2027874 - name: test num_bytes: 1631847321 num_examples: 652490 download_size: 2517788457 dataset_size: 6678885267 license: apache-2.0 task_categories: - text-classification - text-retrieval language: - en - ja - es --- # Dataset Card for "esci" ESCI product search dataset https://github.com/amazon-science/esci-data/ Preprocessing steps: - joined the two relevant files - product_text aggregates all product text fields - mapped esci_label to its full name ```bib @article{reddy2022shopping, title={Shopping Queries Dataset: A Large-Scale {ESCI} Benchmark for Improving Product Search}, author={Chandan K. Reddy and Lluís Màrquez and Fran Valero and Nikhil Rao and Hugo Zaragoza and Sambaran Bandyopadhyay and Arnab Biswas and Anlu Xing and Karthik Subbian}, year={2022}, eprint={2206.06588}, archivePrefix={arXiv} } ```
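The "mapped esci_label to full name" preprocessing step refers to the standard ESCI taxonomy (Exact / Substitute / Complement / Irrelevant). A minimal sketch of that mapping follows; treating rows as plain dicts is an assumption made for illustration, not the card's actual preprocessing code.

```python
# Sketch of expanding single-letter ESCI labels to their full names.
# The taxonomy is the standard ESCI one; the row handling is illustrative.

ESCI_FULL_NAMES = {
    "E": "Exact",       # product fully satisfies the query
    "S": "Substitute",  # product is a reasonable substitute
    "C": "Complement",  # product complements the queried product
    "I": "Irrelevant",  # product is irrelevant to the query
}

def expand_esci_label(example):
    """Replace a single-letter esci_label with its full name."""
    example["esci_label"] = ESCI_FULL_NAMES[example["esci_label"]]
    return example

row = {"query": "usb c cable", "esci_label": "S"}
print(expand_esci_label(row)["esci_label"])  # Substitute
```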
SINAI/SOL
--- license: cc-by-nc-sa-4.0 language: - es tags: - opinion words pretty_name: SOL configs: - config_name: default data_files: - split: positive path: solData/sol_positivas.csv - split: negative path: solData/sol_negativas.csv --- ### Dataset Description **Paper**: [Bilingual Experiments on an Opinion Comparable Corpus](https://aclanthology.org/W13-1612.pdf) **Point of Contact**: emcamara@ujaen.es The resource was built starting from the list of opinion words maintained by Professor Bing Liu (Bing Liu’s Opinion Lexicon). The word list was automatically translated into Spanish using the Reverso translator. The resulting list consists of 1,397 positive and 3,151 negative words. For more information on how the list was developed, see the article Bilingual Experiments on an Opinion Comparable Corpus (in press). ### Licensing Information SOL is released under the [Apache-2.0 License](http://www.apache.org/licenses/LICENSE-2.0). ### Citation Information ```bibtex @inproceedings{martinez-camara-etal-2013-bilingual, title = "Bilingual Experiments on an Opinion Comparable Corpus", author = "Mart{\'\i}nez-C{\'a}mara, Eugenio and Mart{\'\i}n-Valdivia, M. Teresa and Molina-Gonz{\'a}lez, M. Dolores and Ure{\~n}a-L{\'o}pez, L. Alfonso", editor = "Balahur, Alexandra and van der Goot, Erik and Montoyo, Andres", booktitle = "Proceedings of the 4th Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis", month = jun, year = "2013", address = "Atlanta, Georgia", publisher = "Association for Computational Linguistics", url = "https://aclanthology.org/W13-1612", pages = "87--93", } ```
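As a toy illustration of how an opinion lexicon like SOL can be used, the sketch below scores a token list against positive and negative word sets. The example words are invented for illustration; in practice the sets would be loaded from the dataset's `positive` and `negative` splits.

```python
# Toy polarity scoring with an opinion lexicon.
# The word sets below are invented examples, not the actual SOL lists.

def polarity_score(tokens, positive_words, negative_words):
    """Return (#positive matches) - (#negative matches) for a token list."""
    pos = sum(1 for t in tokens if t in positive_words)
    neg = sum(1 for t in tokens if t in negative_words)
    return pos - neg

positive = {"bueno", "excelente", "maravilloso"}
negative = {"malo", "horrible", "aburrido"}

tokens = "una película excelente pero con un final aburrido".split()
print(polarity_score(tokens, positive, negative))  # 0: one positive, one negative
```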
meccanicamente/text_generic_LLAMA2_Oracolo_10
--- license: mit ---
scribis/Wikipedia-it-Trame-di-Film
--- license: apache-2.0 language: - it tags: - wikipedia - movie plots --- A collection of plots of historical and adventure films from the Italian Wikipedia.
fathyshalab/germanquad_qaeval_qaeval_dataset
--- license: cc-by-sa-4.0 task_categories: - text-classification language: - de - en size_categories: - 10K<n<100K ---
chhipakartikey/ayurved_digestion
--- dataset_info: features: - name: human dtype: string - name: ai dtype: string splits: - name: train num_bytes: 44546 num_examples: 141 - name: test num_bytes: 11535 num_examples: 36 download_size: 21265 dataset_size: 56081 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
hivaze/ru-AAQG-QA-QG
--- dataset_info: features: - name: task_type dtype: string - name: instruction dtype: string - name: target dtype: string - name: source dtype: string splits: - name: train num_bytes: 93075676 num_examples: 53264 - name: validation num_bytes: 12239164 num_examples: 6850 download_size: 45289649 dataset_size: 105314840 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* language: - ru tags: - qa - qg - aaqg - instruct - question-answering - question-generation size_categories: - 10K<n<100K task_categories: - question-answering - text2text-generation --- ### Description This is a dataset created for training Russian-language Seq2Seq and CLM models, primarily for tasks related to Closed-Domain QA. The dataset includes 3 main tasks: 1. AAQG (Answer-Aware Question Generation) - generation of questions based on context, provided the answer is known 2. QG - generating questions based on context, without a known answer 3. QA - the standard task of answering a question based on context. AAQG, QG, QA tasks are generated based on regular datasets for which the context, question and correct answer are known. They are generated in a ratio of 0.4, 0.3 and 0.3, respectively. List of datasets used to compile this dataset: 1. sberquad 2. russian_super_glue/muserc 3. russian_super_glue/danetqa Prompt templates used for the three tasks: ```python AAQG_PROMPT = "Сгенерируй вопрос по тексту, используя известный ответ. Текст: '{context}'. Ответ: '{answer}'." QG_PROMPT = "Сгенерируй вопрос по тексту. Текст: '{context}'." QA_PROMPT = "Сгенерируй ответ на вопрос по тексту. Текст: '{context}'. Вопрос: '{question}'." ``` ### Authors - Sergei Bratchikov (https://t.me/nlpwanderer)
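The templates above are plain `str.format` strings, so building an instruction is a one-liner; the context and answer below are invented for illustration:

```python
# Fill the AAQG template from the card with concrete fields.
AAQG_PROMPT = "Сгенерируй вопрос по тексту, используя известный ответ. Текст: '{context}'. Ответ: '{answer}'."

instruction = AAQG_PROMPT.format(
    context="Москва — столица России.",  # example context (invented)
    answer="Москва",                      # example answer (invented)
)
```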
hemachandher/new_dataset
--- dataset_info: features: - name: image struct: - name: bytes dtype: binary - name: path dtype: 'null' - name: text dtype: string splits: - name: train num_bytes: 138098403 num_examples: 1001 download_size: 100680724 dataset_size: 138098403 configs: - config_name: default data_files: - split: train path: data/train-* ---
Indramal/SINHALA_QUESTION_AND_ANSWER_DATASET
--- license: apache-2.0 task_categories: - question-answering language: - si tags: - sinhala - general - question&answering --- # **SINHALA QUESTION AND ANSWER DATASET by Indramal** > **Contact details:** [Indramal Wansekara Profile Website](https://www.indramal.com/)
evi1m0/ttest
--- license: openrail ---
distilled-from-one-sec-cv12/chunk_142
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 1078931152 num_examples: 210236 download_size: 1103387528 dataset_size: 1078931152 --- # Dataset Card for "chunk_142" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
lucasroberto125/Lessa
--- license: artistic-2.0 ---
open-llm-leaderboard/details_TheBloke__Kimiko-13B-fp16
--- pretty_name: Evaluation run of TheBloke/Kimiko-13B-fp16 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [TheBloke/Kimiko-13B-fp16](https://huggingface.co/TheBloke/Kimiko-13B-fp16) on\ \ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 3 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the agregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Kimiko-13B-fp16\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-10-22T20:29:03.807457](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Kimiko-13B-fp16/blob/main/results_2023-10-22T20-29-03.807457.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0017827181208053692,\n\ \ \"em_stderr\": 0.00043200973460388425,\n \"f1\": 0.06370176174496635,\n\ \ \"f1_stderr\": 0.0013821226935642709,\n \"acc\": 0.42755597415707414,\n\ \ \"acc_stderr\": 0.009839681635672129\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.00043200973460388425,\n\ \ \"f1\": 0.06370176174496635,\n \"f1_stderr\": 0.0013821226935642709\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08794541319181198,\n \ \ \"acc_stderr\": 0.007801162197487721\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.7671665351223362,\n \"acc_stderr\": 0.011878201073856539\n\ \ }\n}\n```" repo_url: https://huggingface.co/TheBloke/Kimiko-13B-fp16 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_drop_3 data_files: - split: 2023_10_22T20_29_03.807457 path: - '**/details_harness|drop|3_2023-10-22T20-29-03.807457.parquet' - split: latest path: - '**/details_harness|drop|3_2023-10-22T20-29-03.807457.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_10_22T20_29_03.807457 path: - '**/details_harness|gsm8k|5_2023-10-22T20-29-03.807457.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-10-22T20-29-03.807457.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_10_22T20_29_03.807457 path: - '**/details_harness|winogrande|5_2023-10-22T20-29-03.807457.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-10-22T20-29-03.807457.parquet' - config_name: results data_files: - split: 2023_10_22T20_29_03.807457 path: - results_2023-10-22T20-29-03.807457.parquet - split: latest path: - results_2023-10-22T20-29-03.807457.parquet --- # Dataset Card for Evaluation run of TheBloke/Kimiko-13B-fp16 ## Dataset Description - **Homepage:** - **Repository:** 
https://huggingface.co/TheBloke/Kimiko-13B-fp16 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [TheBloke/Kimiko-13B-fp16](https://huggingface.co/TheBloke/Kimiko-13B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_TheBloke__Kimiko-13B-fp16", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-10-22T20:29:03.807457](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Kimiko-13B-fp16/blob/main/results_2023-10-22T20-29-03.807457.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. 
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.0017827181208053692, "em_stderr": 0.00043200973460388425, "f1": 0.06370176174496635, "f1_stderr": 0.0013821226935642709, "acc": 0.42755597415707414, "acc_stderr": 0.009839681635672129 }, "harness|drop|3": { "em": 0.0017827181208053692, "em_stderr": 0.00043200973460388425, "f1": 0.06370176174496635, "f1_stderr": 0.0013821226935642709 }, "harness|gsm8k|5": { "acc": 0.08794541319181198, "acc_stderr": 0.007801162197487721 }, "harness|winogrande|5": { "acc": 0.7671665351223362, "acc_stderr": 0.011878201073856539 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
kalyan003/Evidence_QA
--- license: unknown ---
dddb/autotrain-data-mt5_chinese_small_finetune
--- task_categories: - conditional-text-generation --- # AutoTrain Dataset for project: mt5_chinese_small_finetune ## Dataset Descritpion This dataset has been automatically processed by AutoTrain for project mt5_chinese_small_finetune. ### Languages The BCP-47 code for the dataset's language is unk. ## Dataset Structure ### Data Instances A sample from this dataset looks as follows: ```json [ { "text": "\u8fd1\u671f\uff0c\u7f8e\u56fd\u56fd\u4f1a\u4f17\u9662\u901a\u8fc7\u6cd5\u6848\uff0c\u91cd\u7533\u7f8e\u56fd\u5bf9\u53f0\u6e7e\u7684\u627f\u8bfa\u3002\u5bf9\u6b64\uff0c\u4e2d\u56fd\u5916\u4ea4\u90e8\u53d1\u8a00\u4eba\u8868\u793a\uff0c\u6709\u5173\u6cd5\u6848\u4e25\u91cd\u8fdd\u53cd\u4e00\u4e2a\u4e2d\u56fd\u539f\u5219\u548c\u4e2d\u7f8e\u4e09\u4e2a\u8054\u5408\u516c\u62a5\u89c4\u5b9a\uff0c\u7c97\u66b4\u5e72\u6d89\u4e2d\u56fd\u5185\u653f\uff0c\u4e2d\u65b9\u5bf9\u6b64\u575a\u51b3\u53cd\u5bf9\u5e76\u5df2\u5411\u7f8e\u65b9\u63d0\u51fa\u4e25\u6b63\u4ea4\u6d89\u3002\n\u4e8b\u5b9e\u4e0a\uff0c\u4e2d[...]", "target": "\u671b\u6d77\u697c\u7f8e\u56fd\u6253\u201c\u53f0\u6e7e\u724c\u201d\u662f\u5371\u9669\u7684\u8d4c\u535a" }, { "text": "\u5728\u63a8\u8fdb\u201c\u53cc\u4e00\u6d41\u201d\u9ad8\u6821\u5efa\u8bbe\u8fdb\u7a0b\u4e2d\uff0c\u6211\u4eec\u8981\u7d27\u7d27\u56f4\u7ed5\u4e3a\u515a\u80b2\u4eba\u3001\u4e3a\u56fd\u80b2\u624d\uff0c\u627e\u51c6\u95ee\u9898\u3001\u7834\u89e3\u96be\u9898\uff0c\u4ee5\u4e00\u6d41\u610f\u8bc6\u548c\u62c5\u5f53\u7cbe\u795e\uff0c\u5927\u529b\u63a8\u8fdb\u9ad8\u6821\u7684\u6cbb\u7406\u80fd\u529b\u5efa\u8bbe\u3002\n\u589e\u5f3a\u653f\u6cbb\u5f15\u9886\u529b\u3002\u575a\u6301\u515a\u5bf9\u9ad8\u6821\u5de5\u4f5c\u7684\u5168\u9762\u9886\u5bfc\uff0c\u59cb\u7ec8\u628a\u653f\u6cbb\u5efa\u8bbe\u6446\u5728[...]", "target": "\u5927\u529b\u63a8\u8fdb\u9ad8\u6821\u6cbb\u7406\u80fd\u529b\u5efa\u8bbe" } ] ``` ### Dataset Fields The dataset has the following fields (also called "features"): ```json { "text": "Value(dtype='string', id=None)", "target": 
"Value(dtype='string', id=None)" } ``` ### Dataset Splits This dataset is split into a train and a validation split. The split sizes are as follows: | Split name | Num samples | | ------------ | ------------------- | | train | 5850 | | valid | 1679 |
zhengr/alpaca-chinese-dataset
--- license: apache-2.0 ---
chuyin0321/news-stocks
--- dataset_info: features: - name: symbol dtype: string - name: body dtype: string - name: publisher dtype: string - name: publish_time dtype: timestamp[ns, tz=GMT] - name: title dtype: string - name: url dtype: string - name: uuid dtype: string splits: - name: train num_bytes: 18435776 num_examples: 2978 download_size: 9157427 dataset_size: 18435776 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "news-stocks" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
AdapterOcean/med_alpaca_standardized_cluster_16_std
--- dataset_info: features: - name: message dtype: string - name: message_type dtype: string - name: message_id dtype: int64 - name: conversation_id dtype: int64 - name: cluster dtype: float64 - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 10849591 num_examples: 22229 download_size: 5657691 dataset_size: 10849591 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "med_alpaca_standardized_cluster_16_std" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
sethapun/arithmetic_2md_1to10
--- dataset_info: features: - name: expression dtype: string - name: answer dtype: float64 - name: label dtype: class_label: names: '0': 'false' '1': 'true' splits: - name: train num_bytes: 54660 num_examples: 2000 - name: validation num_bytes: 10960 num_examples: 400 download_size: 13336 dataset_size: 65620 --- # Dataset Card for "arithmetic_2md_1to10" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
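Judging by the feature names, each row pairs an `expression` with a candidate `answer` and a true/false `label`; the hedged sketch below shows that consistency check. The sample row and helper name are invented, and the field semantics are an assumption based on the feature names alone:

```python
# Hypothetical check relating `expression`, `answer` and `label`
# (the example row is invented; the real data may differ).
def is_consistent(expression: str, answer: float) -> bool:
    """True iff evaluating the arithmetic expression yields the stated answer."""
    # eval is fine for this sketch; expressions here are simple arithmetic.
    return abs(eval(expression) - answer) < 1e-9

row = {"expression": "3 * 4 - 2", "answer": 10.0}
label = "true" if is_consistent(row["expression"], row["answer"]) else "false"
```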
kreimben/leetcode_youtube_cc
--- license: mit dataset_info: features: - name: video_id dtype: string - name: content dtype: string - name: datetime dtype: string - name: leetcode_number dtype: int64 - name: thumbnail dtype: string - name: title dtype: string splits: - name: train num_bytes: 245939427 num_examples: 18621 download_size: 88300478 dataset_size: 245939427 configs: - config_name: default data_files: - split: train path: data/train-* ---
seungheondoh/music-wiki
--- license: mit language: - en tags: - music - wiki size_categories: - 100K<n<1M --- # Dataset Card for "music-wiki" 📚🎵 Introducing **music-wiki** 📊🎶 Our data collection process unfolds as follows: 1) Starting with a seed page from Wikipedia's music section, we navigate through a referenced-page graph, employing recursive crawling up to a depth of 20 levels. 2) Simultaneously, tapping into the rich MusicBrainz dump, we encounter a staggering 11 million unique music entities spanning 10 distinct categories. These entities serve as the foundation for utilizing the Wikipedia API to crawl the corresponding pages. These efforts yield 167k pages from the first method and an additional 193k pages through the second method. Totaling 361k pages, this compilation provides substantial groundwork for establishing a Music-Text-Database. 🎵📚🔍 - **Repository:** [music-wiki](https://github.com/seungheondoh/music-wiki) [![image](https://i.imgur.com/NJrjVyy.png)](https://github.com/seungheondoh/music-wiki) ### splits - wikipedia_music: 167890 - musicbrainz_genre: 1459 - musicbrainz_instrument: 872 - musicbrainz_artist: 7002 - musicbrainz_release: 163068 - musicbrainz_release_group: 15942 - musicbrainz_label: 158 - musicbrainz_work: 4282 - musicbrainz_series: 12 - musicbrainz_place: 49 - musicbrainz_event: 16 - musicbrainz_area: 360
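The first collection step (recursive crawling of the referenced-page graph up to depth 20) can be sketched as a depth-limited BFS. The toy link dictionary below stands in for real Wikipedia API lookups, and all names are illustrative rather than taken from the project's code:

```python
from collections import deque

def crawl(seed, links, max_depth=20):
    """Depth-limited BFS over a page-reference graph.

    `links` maps a page title to the titles it references; in the real
    pipeline each lookup would be a Wikipedia API call rather than a dict.
    """
    seen = {seed}
    queue = deque([(seed, 0)])
    while queue:
        page, depth = queue.popleft()
        if depth == max_depth:
            continue  # do not expand pages at the depth limit
        for ref in links.get(page, ()):
            if ref not in seen:
                seen.add(ref)
                queue.append((ref, depth + 1))
    return seen

# Toy link graph standing in for Wikipedia's music section.
toy_links = {"Music": ["Jazz", "Rock"], "Jazz": ["Blues"]}
```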
prashanthpillai/docvqa_1000_examples
--- dataset_info: features: - name: id dtype: string - name: image dtype: image - name: query struct: - name: de dtype: string - name: en dtype: string - name: es dtype: string - name: fr dtype: string - name: it dtype: string - name: answers sequence: string - name: words sequence: string - name: bounding_boxes sequence: sequence: float32 length: 4 - name: answer struct: - name: match_score dtype: float64 - name: matched_text dtype: string - name: start dtype: int64 - name: text dtype: string splits: - name: train num_bytes: 381430038.0 num_examples: 1000 - name: test num_bytes: 70769538.0 num_examples: 200 download_size: 148367971 dataset_size: 452199576.0 --- # Dataset Card for "docvqa_1000_examples" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
salomonsky/datos
--- license: mit ---
open-llm-leaderboard/details_chihoonlee10__T3Q-MSlerp-13B
--- pretty_name: Evaluation run of chihoonlee10/T3Q-MSlerp-13B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [chihoonlee10/T3Q-MSlerp-13B](https://huggingface.co/chihoonlee10/T3Q-MSlerp-13B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chihoonlee10__T3Q-MSlerp-13B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-03-14T03:48:04.846464](https://huggingface.co/datasets/open-llm-leaderboard/details_chihoonlee10__T3Q-MSlerp-13B/blob/main/results_2024-03-14T03-48-04.846464.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26123911772691083,\n\ \ \"acc_stderr\": 0.030746683415945546,\n \"acc_norm\": 0.26272288267527566,\n\ \ \"acc_norm_stderr\": 0.031569242870413836,\n \"mc1\": 0.23133414932680538,\n\ \ \"mc1_stderr\": 0.014761945174862661,\n \"mc2\": 0.48011742832768245,\n\ \ \"mc2_stderr\": 0.01646970337462705\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.21416382252559726,\n \"acc_stderr\": 0.011988383205966499,\n\ \ \"acc_norm\": 0.2764505119453925,\n \"acc_norm_stderr\": 0.013069662474252428\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25692093208524197,\n\ \ \"acc_stderr\": 0.004360424536145122,\n \"acc_norm\": 0.25851424019119695,\n\ \ \"acc_norm_stderr\": 0.004369232540125879\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\ \ \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n\ \ \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.3355263157894737,\n \"acc_stderr\": 0.03842498559395268,\n\ \ \"acc_norm\": 0.3355263157894737,\n \"acc_norm_stderr\": 0.03842498559395268\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\ \ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \ \ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493857,\n\ \ \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493857\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\ \ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\ \ \"acc_norm_stderr\": 
0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\"\ : 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \ \ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n\ \ \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.32947976878612717,\n\ \ \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n\ \ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n\ \ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.026556982117838728,\n\ \ \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.026556982117838728\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\ \ \"acc_stderr\": 0.039994238792813344,\n \"acc_norm\": 0.23684210526315788,\n\ \ \"acc_norm_stderr\": 0.039994238792813344\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.034559302019248096,\n\ \ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.034559302019248096\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776564,\n \"\ acc_norm\": 
0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776564\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\ \ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\ \ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \ \ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\ \ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\ \ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.03161856335358609,\n\ \ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.03161856335358609\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\ : 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\ \ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.35353535353535354,\n \"acc_stderr\": 0.03406086723547153,\n \"\ acc_norm\": 0.35353535353535354,\n \"acc_norm_stderr\": 0.03406086723547153\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.36787564766839376,\n \"acc_stderr\": 0.03480175668466036,\n\ \ \"acc_norm\": 0.36787564766839376,\n \"acc_norm_stderr\": 0.03480175668466036\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.3641025641025641,\n \"acc_stderr\": 0.02439667298509477,\n \ \ \"acc_norm\": 0.3641025641025641,\n \"acc_norm_stderr\": 0.02439667298509477\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.23703703703703705,\n \"acc_stderr\": 0.02592887613276612,\n \ \ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.02592887613276612\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.3487394957983193,\n \"acc_stderr\": 0.03095663632856655,\n \ \ \"acc_norm\": 0.3487394957983193,\n \"acc_norm_stderr\": 0.03095663632856655\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\ acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.3486238532110092,\n \"acc_stderr\": 0.020431254090714328,\n \"\ acc_norm\": 0.3486238532110092,\n \"acc_norm_stderr\": 0.020431254090714328\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.1574074074074074,\n \"acc_stderr\": 0.02483717351824239,\n \"\ acc_norm\": 0.1574074074074074,\n \"acc_norm_stderr\": 0.02483717351824239\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693254,\n \"\ acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693254\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.20253164556962025,\n \"acc_stderr\": 0.026160568246601457,\n \ \ \"acc_norm\": 0.20253164556962025,\n \"acc_norm_stderr\": 0.026160568246601457\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.10762331838565023,\n\ \ \"acc_stderr\": 0.020799400082879997,\n \"acc_norm\": 0.10762331838565023,\n\ \ \"acc_norm_stderr\": 0.020799400082879997\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\ \ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.14049586776859505,\n \"acc_stderr\": 0.03172233426002161,\n \"\ acc_norm\": 0.14049586776859505,\n \"acc_norm_stderr\": 0.03172233426002161\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\ \ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\ \ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n\ \ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16071428571428573,\n\ \ \"acc_stderr\": 0.03485946096475741,\n \"acc_norm\": 0.16071428571428573,\n\ \ \"acc_norm_stderr\": 0.03485946096475741\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258972,\n\ \ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258972\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\ \ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\ \ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \ \ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.20434227330779056,\n\ \ \"acc_stderr\": 0.0144191239809319,\n \"acc_norm\": 0.20434227330779056,\n\ \ \"acc_norm_stderr\": 0.0144191239809319\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757183,\n\ \ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757183\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\ \ \"acc_stderr\": 0.014333522059217889,\n 
\"acc_norm\": 0.2424581005586592,\n\ \ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.02609016250427905,\n\ \ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.02609016250427905\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n\ \ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n\ \ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n\ \ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537762,\n \ \ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537762\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24445893089960888,\n\ \ \"acc_stderr\": 0.010976425013113886,\n \"acc_norm\": 0.24445893089960888,\n\ \ \"acc_norm_stderr\": 0.010976425013113886\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\ \ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.2173202614379085,\n \"acc_stderr\": 0.01668482092914859,\n \ \ \"acc_norm\": 0.2173202614379085,\n \"acc_norm_stderr\": 0.01668482092914859\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\ \ \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n\ \ \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \ \ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.26865671641791045,\n\ \ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.26865671641791045,\n\ \ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \ \ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\ \ \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n\ \ \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.17543859649122806,\n \"acc_stderr\": 0.029170885500727654,\n\ \ \"acc_norm\": 0.17543859649122806,\n \"acc_norm_stderr\": 0.029170885500727654\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23133414932680538,\n\ \ \"mc1_stderr\": 0.014761945174862661,\n \"mc2\": 0.48011742832768245,\n\ \ \"mc2_stderr\": 0.01646970337462705\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.4988161010260458,\n \"acc_stderr\": 0.014052446290529022\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\ : 0.0\n }\n}\n```" repo_url: https://huggingface.co/chihoonlee10/T3Q-MSlerp-13B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|arc:challenge|25_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-03-14T03-48-04.846464.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|gsm8k|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_03_14T03_48_04.846464 path: - 
'**/details_harness|hellaswag|10_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T03-48-04.846464.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-14T03-48-04.846464.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T03-48-04.846464.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T03-48-04.846464.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-management|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T03-48-04.846464.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-03-14T03-48-04.846464.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T03-48-04.846464.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-management|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-marketing|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-virology|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T03-48-04.846464.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|truthfulqa:mc|0_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-03-14T03-48-04.846464.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_03_14T03_48_04.846464 path: - '**/details_harness|winogrande|5_2024-03-14T03-48-04.846464.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-03-14T03-48-04.846464.parquet' - config_name: results data_files: - split: 
2024_03_14T03_48_04.846464 path: - results_2024-03-14T03-48-04.846464.parquet - split: latest path: - results_2024-03-14T03-48-04.846464.parquet --- # Dataset Card for Evaluation run of chihoonlee10/T3Q-MSlerp-13B <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [chihoonlee10/T3Q-MSlerp-13B](https://huggingface.co/chihoonlee10/T3Q-MSlerp-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_chihoonlee10__T3Q-MSlerp-13B", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-03-14T03:48:04.846464](https://huggingface.co/datasets/open-llm-leaderboard/details_chihoonlee10__T3Q-MSlerp-13B/blob/main/results_2024-03-14T03-48-04.846464.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You can find each result under the "latest" split for each eval): ```python { "all": { "acc": 0.26123911772691083, "acc_stderr": 0.030746683415945546, "acc_norm": 0.26272288267527566, "acc_norm_stderr": 0.031569242870413836, "mc1": 0.23133414932680538, "mc1_stderr": 0.014761945174862661, "mc2": 0.48011742832768245, "mc2_stderr": 0.01646970337462705 }, "harness|arc:challenge|25": { "acc": 0.21416382252559726, "acc_stderr": 0.011988383205966499, "acc_norm": 0.2764505119453925, "acc_norm_stderr": 0.013069662474252428 }, "harness|hellaswag|10": { "acc": 0.25692093208524197, "acc_stderr": 0.004360424536145122, "acc_norm": 0.25851424019119695, "acc_norm_stderr": 0.004369232540125879 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.22962962962962963, "acc_stderr": 0.03633384414073461, "acc_norm": 0.22962962962962963, "acc_norm_stderr": 0.03633384414073461 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.3355263157894737, "acc_stderr": 0.03842498559395268, "acc_norm": 0.3355263157894737, "acc_norm_stderr": 0.03842498559395268 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.2981132075471698, "acc_stderr": 0.028152837942493857, "acc_norm": 0.2981132075471698, "acc_norm_stderr": 0.028152837942493857 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.2638888888888889, "acc_stderr": 0.03685651095897532, "acc_norm": 0.2638888888888889, "acc_norm_stderr": 0.03685651095897532 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.33, "acc_stderr": 0.04725815626252604, "acc_norm": 0.33, 
"acc_norm_stderr": 0.04725815626252604 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.31, "acc_stderr": 0.04648231987117316, "acc_norm": 0.31, "acc_norm_stderr": 0.04648231987117316 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.32947976878612717, "acc_stderr": 0.03583901754736411, "acc_norm": 0.32947976878612717, "acc_norm_stderr": 0.03583901754736411 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.37254901960784315, "acc_stderr": 0.04810840148082633, "acc_norm": 0.37254901960784315, "acc_norm_stderr": 0.04810840148082633 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.18, "acc_stderr": 0.038612291966536955, "acc_norm": 0.18, "acc_norm_stderr": 0.038612291966536955 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.20851063829787234, "acc_stderr": 0.026556982117838728, "acc_norm": 0.20851063829787234, "acc_norm_stderr": 0.026556982117838728 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.23684210526315788, "acc_stderr": 0.039994238792813344, "acc_norm": 0.23684210526315788, "acc_norm_stderr": 0.039994238792813344 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.2206896551724138, "acc_stderr": 0.034559302019248096, "acc_norm": 0.2206896551724138, "acc_norm_stderr": 0.034559302019248096 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.2671957671957672, "acc_stderr": 0.022789673145776564, "acc_norm": 0.2671957671957672, "acc_norm_stderr": 0.022789673145776564 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.36507936507936506, "acc_stderr": 0.04306241259127153, "acc_norm": 0.36507936507936506, "acc_norm_stderr": 0.04306241259127153 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.18, "acc_stderr": 0.03861229196653694, "acc_norm": 0.18, "acc_norm_stderr": 0.03861229196653694 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.3161290322580645, "acc_stderr": 0.02645087448904277, "acc_norm": 0.3161290322580645, "acc_norm_stderr": 0.02645087448904277 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.28078817733990147, "acc_stderr": 0.03161856335358609, "acc_norm": 0.28078817733990147, "acc_norm_stderr": 0.03161856335358609 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.21, "acc_stderr": 0.040936018074033256, "acc_norm": 0.21, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.2545454545454545, "acc_stderr": 0.03401506715249039, "acc_norm": 0.2545454545454545, "acc_norm_stderr": 0.03401506715249039 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.35353535353535354, "acc_stderr": 0.03406086723547153, "acc_norm": 0.35353535353535354, "acc_norm_stderr": 0.03406086723547153 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.36787564766839376, "acc_stderr": 0.03480175668466036, "acc_norm": 0.36787564766839376, "acc_norm_stderr": 0.03480175668466036 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.3641025641025641, "acc_stderr": 0.02439667298509477, "acc_norm": 0.3641025641025641, "acc_norm_stderr": 0.02439667298509477 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.23703703703703705, "acc_stderr": 0.02592887613276612, "acc_norm": 0.23703703703703705, "acc_norm_stderr": 0.02592887613276612 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.3487394957983193, "acc_stderr": 0.03095663632856655, "acc_norm": 0.3487394957983193, "acc_norm_stderr": 0.03095663632856655 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.33112582781456956, "acc_stderr": 0.038425817186598696, "acc_norm": 0.33112582781456956, "acc_norm_stderr": 0.038425817186598696 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.3486238532110092, "acc_stderr": 0.020431254090714328, "acc_norm": 0.3486238532110092, "acc_norm_stderr": 0.020431254090714328 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.1574074074074074, "acc_stderr": 
0.02483717351824239, "acc_norm": 0.1574074074074074, "acc_norm_stderr": 0.02483717351824239 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.25980392156862747, "acc_stderr": 0.030778554678693254, "acc_norm": 0.25980392156862747, "acc_norm_stderr": 0.030778554678693254 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.20253164556962025, "acc_stderr": 0.026160568246601457, "acc_norm": 0.20253164556962025, "acc_norm_stderr": 0.026160568246601457 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.10762331838565023, "acc_stderr": 0.020799400082879997, "acc_norm": 0.10762331838565023, "acc_norm_stderr": 0.020799400082879997 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.2824427480916031, "acc_stderr": 0.03948406125768361, "acc_norm": 0.2824427480916031, "acc_norm_stderr": 0.03948406125768361 }, "harness|hendrycksTest-international_law|5": { "acc": 0.14049586776859505, "acc_stderr": 0.03172233426002161, "acc_norm": 0.14049586776859505, "acc_norm_stderr": 0.03172233426002161 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.21296296296296297, "acc_stderr": 0.0395783547198098, "acc_norm": 0.21296296296296297, "acc_norm_stderr": 0.0395783547198098 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.2331288343558282, "acc_stderr": 0.033220157957767414, "acc_norm": 0.2331288343558282, "acc_norm_stderr": 0.033220157957767414 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.16071428571428573, "acc_stderr": 0.03485946096475741, "acc_norm": 0.16071428571428573, "acc_norm_stderr": 0.03485946096475741 }, "harness|hendrycksTest-management|5": { "acc": 0.3786407766990291, "acc_stderr": 0.04802694698258972, "acc_norm": 0.3786407766990291, "acc_norm_stderr": 0.04802694698258972 }, "harness|hendrycksTest-marketing|5": { "acc": 0.19658119658119658, "acc_stderr": 0.02603538609895129, "acc_norm": 0.19658119658119658, "acc_norm_stderr": 0.02603538609895129 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.24, 
"acc_stderr": 0.04292346959909281, "acc_norm": 0.24, "acc_norm_stderr": 0.04292346959909281 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.20434227330779056, "acc_stderr": 0.0144191239809319, "acc_norm": 0.20434227330779056, "acc_norm_stderr": 0.0144191239809319 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.2138728323699422, "acc_stderr": 0.022075709251757183, "acc_norm": 0.2138728323699422, "acc_norm_stderr": 0.022075709251757183 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.2424581005586592, "acc_stderr": 0.014333522059217889, "acc_norm": 0.2424581005586592, "acc_norm_stderr": 0.014333522059217889 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.29411764705882354, "acc_stderr": 0.02609016250427905, "acc_norm": 0.29411764705882354, "acc_norm_stderr": 0.02609016250427905 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.24115755627009647, "acc_stderr": 0.024296594034763426, "acc_norm": 0.24115755627009647, "acc_norm_stderr": 0.024296594034763426 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.22530864197530864, "acc_stderr": 0.023246202647819746, "acc_norm": 0.22530864197530864, "acc_norm_stderr": 0.023246202647819746 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.24113475177304963, "acc_stderr": 0.025518731049537762, "acc_norm": 0.24113475177304963, "acc_norm_stderr": 0.025518731049537762 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.24445893089960888, "acc_stderr": 0.010976425013113886, "acc_norm": 0.24445893089960888, "acc_norm_stderr": 0.010976425013113886 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.4485294117647059, "acc_stderr": 0.030211479609121593, "acc_norm": 0.4485294117647059, "acc_norm_stderr": 0.030211479609121593 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.2173202614379085, "acc_stderr": 0.01668482092914859, "acc_norm": 0.2173202614379085, "acc_norm_stderr": 0.01668482092914859 }, "harness|hendrycksTest-public_relations|5": { "acc": 
0.22727272727272727, "acc_stderr": 0.04013964554072774, "acc_norm": 0.22727272727272727, "acc_norm_stderr": 0.04013964554072774 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.4, "acc_stderr": 0.031362502409358936, "acc_norm": 0.4, "acc_norm_stderr": 0.031362502409358936 }, "harness|hendrycksTest-sociology|5": { "acc": 0.26865671641791045, "acc_stderr": 0.03134328358208954, "acc_norm": 0.26865671641791045, "acc_norm_stderr": 0.03134328358208954 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.26, "acc_stderr": 0.04408440022768078, "acc_norm": 0.26, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-virology|5": { "acc": 0.3192771084337349, "acc_stderr": 0.0362933532994786, "acc_norm": 0.3192771084337349, "acc_norm_stderr": 0.0362933532994786 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.17543859649122806, "acc_stderr": 0.029170885500727654, "acc_norm": 0.17543859649122806, "acc_norm_stderr": 0.029170885500727654 }, "harness|truthfulqa:mc|0": { "mc1": 0.23133414932680538, "mc1_stderr": 0.014761945174862661, "mc2": 0.48011742832768245, "mc2_stderr": 0.01646970337462705 }, "harness|winogrande|5": { "acc": 0.4988161010260458, "acc_stderr": 0.014052446290529022 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
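The per-task blocks in the results above are combined into the top-level "all" entry. A minimal sketch of that macro-averaging, using illustrative task names and values rather than the real run data shown in this card:

```python
# Sketch: recompute an aggregate ("all"-style) metric as the mean of the
# per-task harness metrics. Task names and values here are illustrative,
# not taken from the actual run above.
from statistics import mean

per_task = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.20, "acc_norm": 0.24},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.30, "acc_norm": 0.36},
}

# Macro-average each metric over the evaluated tasks.
aggregate = {
    metric: mean(task[metric] for task in per_task.values())
    for metric in ("acc", "acc_norm")
}
```

The same pattern extends to the other reported metrics (e.g. `mc1`/`mc2`) by adding them to the metric tuple for the tasks that report them.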
GHOFRANEE/Dataset_ocr_to_json
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* dataset_info: features: - name: instruction dtype: string - name: input dtype: string - name: output dtype: string splits: - name: train num_bytes: 12423052 num_examples: 2138 - name: validation num_bytes: 638444 num_examples: 70 download_size: 5759879 dataset_size: 13061496 --- # Dataset Card for "Dataset_ocr_to_json" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
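The card above exposes plain `instruction`/`input`/`output` columns. A minimal sketch of collapsing one such record into a single training prompt — the record content and the prompt template are assumptions for illustration, not part of the dataset:

```python
# Sketch: format one instruction/input/output record (schema from the card
# above) into a single prompt string. The field contents and the template
# are assumptions, not taken from the actual dataset.
record = {
    "instruction": "Convert the OCR text into structured JSON.",
    "input": "Invoice No: 42  Total: 19.99 EUR",
    "output": '{"invoice_no": "42", "total": "19.99", "currency": "EUR"}',
}

prompt = (
    f"### Instruction:\n{record['instruction']}\n\n"
    f"### Input:\n{record['input']}\n\n"
    f"### Response:\n{record['output']}"
)
```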
open-llm-leaderboard/details_LoupGarou__WizardCoder-Guanaco-15B-V1.0
--- pretty_name: Evaluation run of LoupGarou/WizardCoder-Guanaco-15B-V1.0 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [LoupGarou/WizardCoder-Guanaco-15B-V1.0](https://huggingface.co/LoupGarou/WizardCoder-Guanaco-15B-V1.0)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 64 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_LoupGarou__WizardCoder-Guanaco-15B-V1.0\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-09-23T04:03:10.692358](https://huggingface.co/datasets/open-llm-leaderboard/details_LoupGarou__WizardCoder-Guanaco-15B-V1.0/blob/main/results_2023-09-23T04-03-10.692358.json)(note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.04089765100671141,\n\ \ \"em_stderr\": 0.0020282491887764946,\n \"f1\": 0.08708682885906038,\n\ \ \"f1_stderr\": 0.002301893268858503,\n \"acc\": 0.27279042923742786,\n\ \ \"acc_stderr\": 0.008653599278888232\n },\n \"harness|drop|3\": {\n\ \ \"em\": 0.04089765100671141,\n \"em_stderr\": 0.0020282491887764946,\n\ \ \"f1\": 0.08708682885906038,\n \"f1_stderr\": 0.002301893268858503\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.014404852160727824,\n \ \ \"acc_stderr\": 0.0032820559171369513\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.5311760063141279,\n \"acc_stderr\": 0.014025142640639511\n\ \ }\n}\n```" repo_url: https://huggingface.co/LoupGarou/WizardCoder-Guanaco-15B-V1.0 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|arc:challenge|25_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-07-24T16:55:06.473074.parquet' - config_name: harness_drop_3 data_files: - split: 2023_09_23T04_03_10.692358 path: - '**/details_harness|drop|3_2023-09-23T04-03-10.692358.parquet' - split: latest path: - '**/details_harness|drop|3_2023-09-23T04-03-10.692358.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_09_23T04_03_10.692358 path: - '**/details_harness|gsm8k|5_2023-09-23T04-03-10.692358.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-09-23T04-03-10.692358.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hellaswag|10_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 
2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:55:06.473074.parquet' - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-management|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:55:06.473074.parquet' - 
'**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:55:06.473074.parquet' - 
'**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:55:06.473074.parquet' - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-management|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:55:06.473074.parquet' - 
'**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-07-24T16:55:06.473074.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - 
split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - 
'**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:55:06.473074.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - 
'**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - 
'**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-management|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 
2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - 
'**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-virology|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:55:06.473074.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_07_24T16_55_06.473074 path: - '**/details_harness|truthfulqa:mc|0_2023-07-24T16:55:06.473074.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-07-24T16:55:06.473074.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_09_23T04_03_10.692358 path: - '**/details_harness|winogrande|5_2023-09-23T04-03-10.692358.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-09-23T04-03-10.692358.parquet' - config_name: results data_files: - split: 2023_07_24T16_55_06.473074 path: - results_2023-07-24T16:55:06.473074.parquet - split: 2023_09_23T04_03_10.692358 path: - results_2023-09-23T04-03-10.692358.parquet - split: latest path: - results_2023-09-23T04-03-10.692358.parquet --- # Dataset Card for Evaluation run of LoupGarou/WizardCoder-Guanaco-15B-V1.0 ## Dataset 
Description - **Homepage:** - **Repository:** https://huggingface.co/LoupGarou/WizardCoder-Guanaco-15B-V1.0 - **Paper:** - **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard - **Point of Contact:** clementine@hf.co ### Dataset Summary Dataset automatically created during the evaluation run of model [LoupGarou/WizardCoder-Guanaco-15B-V1.0](https://huggingface.co/LoupGarou/WizardCoder-Guanaco-15B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_LoupGarou__WizardCoder-Guanaco-15B-V1.0", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-09-23T04:03:10.692358](https://huggingface.co/datasets/open-llm-leaderboard/details_LoupGarou__WizardCoder-Guanaco-15B-V1.0/blob/main/results_2023-09-23T04-03-10.692358.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "em": 0.04089765100671141, "em_stderr": 0.0020282491887764946, "f1": 0.08708682885906038, "f1_stderr": 0.002301893268858503, "acc": 0.27279042923742786, "acc_stderr": 0.008653599278888232 }, "harness|drop|3": { "em": 0.04089765100671141, "em_stderr": 0.0020282491887764946, "f1": 0.08708682885906038, "f1_stderr": 0.002301893268858503 }, "harness|gsm8k|5": { "acc": 0.014404852160727824, "acc_stderr": 0.0032820559171369513 }, "harness|winogrande|5": { "acc": 0.5311760063141279, "acc_stderr": 0.014025142640639511 } } ``` ### Supported Tasks and Leaderboards [More Information Needed] ### Languages [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
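A quick way to sanity-check the aggregated numbers above: in this run, the `acc` in the `"all"` block appears to be the unweighted mean of the per-task accuracies (`gsm8k` and `winogrande`). A minimal check in Python, with the values copied from the JSON above:

```python
# Values copied from the "Latest results" JSON above.
results = {
    "all": {"acc": 0.27279042923742786},
    "harness|gsm8k|5": {"acc": 0.014404852160727824},
    "harness|winogrande|5": {"acc": 0.5311760063141279},
}

# The aggregated accuracy is the unweighted mean over the accuracy-based tasks.
task_accs = [v["acc"] for name, v in results.items() if name != "all"]
mean_acc = sum(task_accs) / len(task_accs)

print(mean_acc)  # ~0.27279, matching results["all"]["acc"]
```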
REDTMX/J
--- license: apache-2.0 ---
DeadBeast/dreambooth-images
--- license: openrail dataset_info: features: - name: image dtype: image splits: - name: train num_bytes: 27360897.0 num_examples: 11 download_size: 27356699 dataset_size: 27360897.0 ---
chronbmm/sanskrit-stemming-tagging
--- dataset_info: features: - name: sentence dtype: string - name: unsandhied dtype: string splits: - name: train num_bytes: 251165593 num_examples: 626782 - name: validation num_bytes: 3735346 num_examples: 9453 - name: test num_bytes: 3598838 num_examples: 9026 - name: test_long_500 num_bytes: 193539 num_examples: 500 - name: validation_long_500 num_bytes: 193658 num_examples: 500 download_size: 72753688 dataset_size: 258886974 --- # Dataset Card for "sanskrit-stemming-tagging" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
muibk/wmt21_metrics_task
--- annotations_creators: - expert-generated language_creators: - found - machine-generated - expert-generated language: - bn-hi - cs-en - de-en - de-fr - en-cs - en-de - en-ha - en-is - en-ja - en-ru - en-zh - fr-de - ha-en - hi-bn - is-en - ja-en - ru-en - xh-zu - zh-en - zu-xh license: - unknown multilinguality: - translation paperswithcode_id: null pretty_name: WMT21 Metrics Shared Task size_categories: - 100K<n<1M source_datasets: [] task_categories: - translation task_ids: [] --- # Dataset Card for WMT21 Metrics Task ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards) - [Languages](#languages) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - [Source Data](#source-data) - [Annotations](#annotations) - [Personal and Sensitive Information](#personal-and-sensitive-information) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Other Known Limitations](#other-known-limitations) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Licensing Information](#licensing-information) - [Citation Information](#citation-information) - [Contributions](#contributions) ## Dataset Description - **Homepage:** [WMT21 Metrics Shared Task](https://www.statmt.org/wmt21/metrics-task.html) - **Repository:** [MT Metrics Eval Github Repository](https://github.com/google-research/mt-metrics-eval) - **Paper:** [Paper](https://aclanthology.org/2021.wmt-1.73/) ### Dataset Summary [More Information Needed] ### Supported Tasks and Leaderboards [More Information Needed] ### Languages The 
dataset comprises twenty language pairs: - Bengali-Hindi (`bn-hi`) - Czech-English (`cs-en`) - German-English (`de-en`) - German-French (`de-fr`) - English-Czech (`en-cs`) - English-German (`en-de`) - English-Hausa (`en-ha`) - English-Icelandic (`en-is`) - English-Japanese (`en-ja`) - English-Russian (`en-ru`) - English-Chinese (`en-zh`) - French-German (`fr-de`) - Hausa-English (`ha-en`) - Hindi-Bengali (`hi-bn`) - Icelandic-English (`is-en`) - Japanese-English (`ja-en`) - Russian-English (`ru-en`) - Xhosa-Zulu (`xh-zu`) - Chinese-English (`zh-en`) - Zulu-Xhosa (`zu-xh`) ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset.
jxm/climate-fever__gtr_base__dpr
--- dataset_info: features: - name: text dtype: string - name: embeddings_A sequence: float32 - name: embeddings_B sequence: float32 splits: - name: train num_bytes: 653714421 num_examples: 100000 download_size: 758665016 dataset_size: 653714421 configs: - config_name: default data_files: - split: train path: data/train-* ---
Nerfgun3/bad_prompt
--- language: - en license: creativeml-openrail-m thumbnail: "https://huggingface.co/datasets/Nerfgun3/bad_prompt/resolve/main/bad_prompt_showcase.jpg" tags: - stable-diffusion - text-to-image - image-to-image inference: false --- # Negative Embedding / Textual Inversion <img alt="Showcase" src="https://huggingface.co/datasets/Nerfgun3/bad_prompt/resolve/main/bad_prompt_showcase.jpg"/> ## Idea The idea behind this embedding was to somehow train the negative prompt as an embedding, thus unifying the basis of the negative prompt into one word or embedding. Side note: the embedding has proven to be very helpful for the generation of hands! :) ## Usage To use this embedding you have to download the file and drop it into the "\stable-diffusion-webui\embeddings" folder. **Please put the embedding in the negative prompt to get the right results!** For special negative tags such as "malformed sword", you still need to add them yourself. The negative embedding is trained on a basic skeleton for the negative prompt, which should provide a high-resolution image as a result. ### Version 1: Issue: changes the style too much. To use it in the negative prompt: ```"bad_prompt"``` Personally, I would recommend using my embeddings with a strength of 0.8, even the negative embeddings, like ```"(bad_prompt:0.8)"``` ### Version 2: With this version I tried to reduce the number of vectors used, as well as the issue with the changing art style. The newer version is still a work in progress, but it's already way better than the first version. It's in the files section! I hope you enjoy the embedding. If you have any questions, you can ask me anything via Discord: "Nerfgun3#7508" ## License This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage. The CreativeML OpenRAIL License specifies: 1. You can't use the embedding to deliberately produce nor share illegal or harmful outputs or content 2. 
The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license 3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully) [Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license)
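The `(bad_prompt:0.8)` notation recommended under Usage is the webui's attention-weight syntax: `(name:weight)` scales how strongly a token or embedding is applied, and a bare name defaults to a weight of 1.0. A minimal sketch of parsing that simple single-token form in Python (the function name is a hypothetical illustration; the actual webui parser also handles nesting, escapes, and bracket syntax):

```python
import re

# Hypothetical helper: parse the simple "(name:weight)" attention syntax
# used in prompts like "(bad_prompt:0.8)". Bare names default to weight 1.0.
def parse_weighted_token(token: str) -> tuple[str, float]:
    match = re.fullmatch(r"\(([^():]+):([0-9.]+)\)", token.strip())
    if match:
        return match.group(1), float(match.group(2))
    return token.strip(), 1.0

print(parse_weighted_token("(bad_prompt:0.8)"))  # ('bad_prompt', 0.8)
print(parse_weighted_token("bad_prompt"))        # ('bad_prompt', 1.0)
```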
mii-llm/oasst2-dpo-it
--- dataset_info: features: - name: prompt dtype: string - name: chosen dtype: string - name: rejected dtype: string splits: - name: train num_bytes: 376043 num_examples: 210 download_size: 193646 dataset_size: 376043 configs: - config_name: default data_files: - split: train path: data/train-* ---
Avatarofhemant/News
--- license: apache-2.0 task_categories: - zero-shot-classification - text-classification language: - en tags: - art - music ---
One-2-3-45/training_data
--- license: openrail tags: - text-to-3d - image-to-3d ---
elliotthwang/guanaco-llama2-chinese-1k
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 1348677 num_examples: 1000 download_size: 0 dataset_size: 1348677 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "guanaco-llama2-chinese-1k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
pbaoo2705/qa_processed_eval
--- dataset_info: features: - name: pubid dtype: int32 - name: question dtype: string - name: context dtype: string - name: long_answer dtype: string - name: final_decision dtype: string - name: text dtype: string - name: answers dtype: string splits: - name: test num_bytes: 1930174 num_examples: 1000 download_size: 1132172 dataset_size: 1930174 configs: - config_name: default data_files: - split: test path: data/test-* --- # Dataset Card for "qa_processed_eval" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
starsnatched/test
--- license: apache-2.0 ---
christianloyal/loyal_clinc_MLE
--- license: mit --- Dataset for Loyal Health Inc Software Engineer Machine Learning Interview