| datasetId | card |
|---|---|
liuyanchen1015/MULTI_VALUE_stsb_participle_past_tense | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 9502
num_examples: 45
- name: test
num_bytes: 9265
num_examples: 39
- name: train
num_bytes: 47094
num_examples: 208
download_size: 53657
dataset_size: 65861
---
# Dataset Card for "MULTI_VALUE_stsb_participle_past_tense"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Minglii/A_QthenA_4096 | ---
dataset_info:
features:
- name: data
struct:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 359881748
num_examples: 52002
download_size: 119164182
dataset_size: 359881748
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "A_QthenA_4096"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
michaelnath/bad_code_to_good_code_dataset | ---
dataset_info:
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 2905897365
num_examples: 2786238
download_size: 550189166
dataset_size: 2905897365
---
# Dataset Card for "bad_code_to_good_code_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-xsum-default-21f5cd-15036097 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: sshleifer/distilbart-xsum-9-6
metrics: ['accuracy']
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: sshleifer/distilbart-xsum-9-6
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Rohil](https://huggingface.co/Rohil) for evaluating this model. |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/d9292a47 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1342
dataset_size: 184
---
# Dataset Card for "d9292a47"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kaina99/Delegado | ---
license: openrail
---
|
Ksingleton/KBase_SDK_Docs_Orig | ---
license: apache-2.0
---
|
nataliaElv/similarity-qa-with-vectors | ---
size_categories: 1K<n<10K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for similarity-qa-with-vectors
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, install Argilla with `pip install argilla --upgrade` and then run the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("nataliaElv/similarity-qa-with-vectors")
```
### Load with `datasets`
To load this dataset with `datasets`, install `datasets` with `pip install datasets --upgrade` and then run the following code:
```python
from datasets import load_dataset
ds = load_dataset("nataliaElv/similarity-qa-with-vectors")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves; at the moment only text fields are supported. These are what annotators see when providing responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| instruction | Instruction | text | True | False |
| input | Input | text | False | False |
| output | Output | text | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| quality | Rate the quality of the record: | rating | True | N/A | [1, 2, 3, 4, 5] |
| explanation | Explain your rating: | text | True | N/A | N/A |
The **suggestions** are human- or machine-generated recommendations for each question, intended to assist the annotator during the annotation process. Each suggestion is linked to an existing question and named by appending "-suggestion" and "-suggestion-metadata" to the question name; these columns contain the suggested value(s) and the suggestion's metadata, respectively. The possible values are the same as in the table above.
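As a minimal sketch of the naming convention just described (the question names "quality" and "explanation" are taken from this card's questions table), the suggestion column names can be derived like this:

```python
# Sketch of the "-suggestion" / "-suggestion-metadata" naming convention.
# Question names come from this card's questions table.
questions = ["quality", "explanation"]

suggestion_columns = []
for question in questions:
    # column holding the suggested value(s) for the question
    suggestion_columns.append(f"{question}-suggestion")
    # column holding the suggestion's provenance (e.g. agent, score, type)
    suggestion_columns.append(f"{question}-suggestion-metadata")

print(suggestion_columns)
```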
The **metadata** is a dictionary that can be used to provide additional information about a dataset record. This can be useful to give annotators extra context, or to record details about the record itself, such as the author, the date, or a link to the original source. Metadata is always optional and can be linked to the `metadata_properties` defined in the dataset configuration file `argilla.yaml`.
**✨ NEW** The **vectors** are columns containing floating-point vectors, constrained to the dimensions pre-defined in the **vectors_settings** when the vectors are configured within the dataset; the vectors are always 1-dimensional. The **vectors** are optional and identified by the vector names pre-defined in the dataset configuration file `argilla.yaml`.
| Vector Name | Title | Dimensions |
|-------------|-------|------------|
| input | Input | [1, 384] |
| instruction | Instruction | [1, 384] |
| output | Output | [1, 384] |
| testing | EMPTY! | [1, 1] |
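A minimal sketch, not the Argilla API, of validating that a record's optional vectors match the dimensions configured in the table above (a `[1, 384]` dimension is read here as a 384-long 1-dimensional list):

```python
# Vector names and dimensions taken from this card's vectors table.
vector_settings = {"input": 384, "instruction": 384, "output": 384, "testing": 1}

def vectors_match_settings(record_vectors, settings=vector_settings):
    """True if every vector present has a configured name and the right length."""
    return all(
        name in settings and len(values) == settings[name]
        for name, values in record_vectors.items()
    )

# Vectors are optional, so a record may carry only a subset of them.
record = {"input": [0.0] * 384, "instruction": [0.0] * 384}
print(vectors_match_settings(record))  # → True
```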
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
| text_length | text_length | integer | None - None | True |
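As a hypothetical illustration of the `text_length` metadata property defined above (the first record mirrors the example instance shown later in this card), metadata can be used for simple client-side filtering; this is a sketch, not Argilla's own metadata-filter API:

```python
# Hypothetical records carrying the `text_length` metadata property.
records = [
    {"metadata": {"text_length": 241}},
    {"metadata": {"text_length": 36}},
]

# Keep only records whose text is longer than 100 characters.
long_records = [r for r in records if r["metadata"]["text_length"] > 100]
print(len(long_records))  # → 1
```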
The **guidelines** are optional as well: a plain string that can be used to provide instructions to the annotators. Find them in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"input": "",
"instruction": "Give three tips for staying healthy.",
"output": "1. Eat a balanced diet and make sure to include plenty of fruits and vegetables. \n2. Exercise regularly to keep your body active and strong. \n3. Get enough sleep and maintain a consistent sleep schedule."
},
"metadata": {
"text_length": 241
},
"responses": [],
"suggestions": [],
"vectors": {
"input": [
-0.025378959253430367,
-0.005421411711722612,
-0.005123426206409931,
... (remaining values of the 384-dimensional vector omitted for brevity)
],
"instruction": [
-0.028294799849390984,
0.011423577554523945,
0.036473676562309265,
... (remaining values of the 384-dimensional vector omitted for brevity)
],
"output": [
-0.0176240187138319,
-0.004716125782579184,
0.006949670612812042,
... (remaining values of the 384-dimensional vector omitted for brevity)
-0.002774772234261036,
-0.0019064220832660794,
0.01827140338718891,
-0.08101121336221695,
-0.021098297089338303,
-0.02879447676241398,
-0.0014344346709549427,
-0.03157539293169975,
-0.03340595215559006,
-0.017963532358407974,
0.002487666206434369,
0.004737753886729479,
0.009413175284862518,
-0.004956235643476248,
-0.018281303346157074,
0.01808854751288891,
0.007452634163200855,
-0.0066879065707325935,
0.004095880780369043,
-0.05039187893271446,
-0.04649675264954567,
0.04778149351477623,
0.023173490539193153
]
}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"explanation": [],
"explanation-suggestion": null,
"explanation-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"external_id": null,
"input": "",
"instruction": "Give three tips for staying healthy.",
"metadata": "{\"text_length\": 241}",
"output": "1. Eat a balanced diet and make sure to include plenty of fruits and vegetables. \n2. Exercise regularly to keep your body active and strong. \n3. Get enough sleep and maintain a consistent sleep schedule.",
"quality": [],
"quality-suggestion": null,
"quality-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"vectors": {
"input": [
-0.025378959253430367,
-0.005421411711722612,
-0.005123426206409931,
-0.015000881627202034,
-0.010828345082700253,
0.011933867819607258,
0.019314972683787346,
0.040846794843673706,
-0.009248972870409489,
0.015658004209399223,
0.0018413026118651032,
-0.04884575679898262,
0.007001905702054501,
0.03489101678133011,
0.035010259598493576,
0.004000979475677013,
0.03179853782057762,
0.013713518157601357,
-0.01575734093785286,
0.016500428318977356,
0.02162296697497368,
-0.019962908700108528,
0.011788141913712025,
-0.018135597929358482,
0.00479349447414279,
0.027265621349215508,
-0.00592863280326128,
-0.00819356832653284,
-0.04846194013953209,
-0.19176225364208221,
-0.033277515321969986,
-0.013714526779949665,
0.0032154761720448732,
-0.009890320710837841,
-0.010387021116912365,
-0.009758984670042992,
-0.01616772636771202,
0.013864913955330849,
-0.010939724743366241,
0.04058735817670822,
0.021671248599886894,
0.01383791770786047,
-0.01536033395677805,
-0.010618588887155056,
0.005697894841432571,
-0.02265983633697033,
-0.016780417412519455,
-0.006693877745419741,
0.05799293890595436,
-0.006326382048428059,
0.002093177754431963,
0.010354680009186268,
0.0006329257157631218,
0.027090711519122124,
0.004488569684326649,
0.014552658423781395,
0.0180455781519413,
0.019452394917607307,
0.02411177195608616,
0.008954178541898727,
0.0015302742831408978,
0.029447568580508232,
-0.16580072045326233,
0.02812054567039013,
0.009662247262895107,
0.009475956670939922,
0.013372445479035378,
-0.016405431553721428,
-0.001572685199789703,
0.051213230937719345,
0.003518211655318737,
0.015949634835124016,
-0.0069265239872038364,
0.027317708358168602,
0.019327018409967422,
-0.022707704454660416,
0.028689151629805565,
-0.01890380308032036,
-0.01167482603341341,
0.011035646311938763,
0.0040340544655919075,
-0.012239952571690083,
-0.006184910889714956,
-0.005307812709361315,
-0.03035779856145382,
-0.041286271065473557,
0.010543900541961193,
0.014870839193463326,
0.00642419932410121,
0.01750650443136692,
-0.024431902915239334,
-0.0055658514611423016,
0.02791532501578331,
0.007770954631268978,
-0.06280053406953812,
-0.011230005882680416,
0.022709796205163002,
0.0036207374650985003,
-0.032403528690338135,
0.7040055990219116,
-0.018570110201835632,
0.00400574691593647,
0.03399886190891266,
-0.049098845571279526,
0.0239898469299078,
-0.01194965373724699,
-0.018013538792729378,
-0.012237226590514183,
-0.008749520406126976,
0.0011163142044097185,
0.025379084050655365,
-0.009777436032891273,
0.04108814150094986,
-0.005716001149266958,
0.006996306125074625,
0.01101826224476099,
0.043749451637268066,
0.025922292843461037,
-0.006995497737079859,
-0.031284742057323456,
-0.03961759805679321,
0.024092240259051323,
-0.0037946782540529966,
-0.016933923587203026,
0.009725619107484818,
-0.09440258890390396,
0.008375165052711964,
0.04419294372200966,
0.01720806583762169,
0.025360679253935814,
0.024841418489813805,
-0.037821535021066666,
-0.002577421488240361,
-0.008712586015462875,
0.007797832600772381,
-0.0038116704672574997,
0.019269822165369987,
-0.026785872876644135,
0.04632653668522835,
-0.01628199592232704,
-0.031312331557273865,
-0.06490401178598404,
0.015363720245659351,
-0.06325960904359818,
-0.025076331570744514,
0.043549794703722,
0.0021469779312610626,
-0.01139114424586296,
-0.019525835290551186,
0.01321511808782816,
0.014193642884492874,
-0.0003590172855183482,
0.006383916363120079,
-0.0230486411601305,
0.01811799593269825,
0.008996100164949894,
0.03565937653183937,
0.004165417980402708,
-0.04827389121055603,
0.009129678830504417,
-0.020495550706982613,
-0.0036268446128815413,
-0.012152481824159622,
0.04790886864066124,
0.022871557623147964,
-0.052697136998176575,
-0.024344727396965027,
0.00391955254599452,
0.02152823470532894,
-0.021536199375987053,
0.0035667491611093283,
0.017030438408255577,
-0.018038615584373474,
0.0029417292680591345,
0.060567457228899,
0.007039966527372599,
-0.036729853600263596,
-0.017760826274752617,
-0.003907470498234034,
0.00815458782017231,
0.013006726279854774,
-0.02316906675696373,
-0.043683670461177826,
0.003448701463639736,
0.015315227210521698,
-0.04293462261557579,
-0.06704577058553696,
-0.0008262014016509056,
0.010253406129777431,
0.030316654592752457,
-0.026838993653655052,
0.028824586421251297,
-0.04089079424738884,
0.010620318353176117,
-0.01844465360045433,
-0.031399376690387726,
-0.029419098049402237,
-0.006011322606354952,
-0.01602524146437645,
-0.027820106595754623,
0.037589482963085175,
-0.025692598894238472,
-0.03817908838391304,
0.045563384890556335,
-0.004913593642413616,
0.03158273175358772,
0.006058005150407553,
-0.013417067006230354,
0.03615306690335274,
0.0157751627266407,
-0.028747329488396645,
-0.016211561858654022,
0.06901752948760986,
0.014868182130157948,
-0.018051955848932266,
0.004836737178266048,
0.01713799685239792,
0.019215645268559456,
-0.010181054472923279,
0.005242344457656145,
0.010851659812033176,
-0.0026485237758606672,
-0.001141647924669087,
-0.24576711654663086,
-0.002747960388660431,
-0.013538523577153683,
-0.01295738760381937,
0.010152100585401058,
-0.04176444187760353,
0.023869045078754425,
-0.007760809734463692,
-0.012792426161468029,
0.06321337074041367,
0.03212174028158188,
-0.01926518976688385,
-0.03867725655436516,
0.004653181880712509,
0.002605821006000042,
0.0397784523665905,
-0.017472509294748306,
-0.01268637552857399,
-0.014788305386900902,
-0.0140827726572752,
-0.004600161220878363,
0.024699149653315544,
-0.04971880093216896,
-0.013434397988021374,
0.04086251184344292,
-0.011998802423477173,
0.1687333583831787,
0.06002860143780708,
0.05463676527142525,
0.009981472045183182,
0.026445526629686356,
-0.002445181366056204,
0.004810625687241554,
-0.0822978988289833,
-0.006071159150451422,
0.023768611252307892,
0.009113253094255924,
-0.01965516433119774,
-0.032944176346063614,
-0.01938377507030964,
-0.029378263279795647,
0.0029978558886796236,
-0.03449537232518196,
-0.04050493985414505,
-0.010722795501351357,
-0.021573275327682495,
-0.003244556486606598,
0.04474780336022377,
-0.006164703518152237,
0.007450612727552652,
0.019294289872050285,
-0.0168308112770319,
0.02694232389330864,
0.0011298403842374682,
0.013066732324659824,
-0.025697633624076843,
-0.05993640422821045,
-0.01706899330019951,
0.0029229209758341312,
0.034100666642189026,
0.01385537814348936,
0.0075796437449753284,
0.013853371143341064,
-0.03558618947863579,
0.024463411420583725,
0.020203080028295517,
-0.016664505004882812,
-0.036146968603134155,
0.009001891128718853,
-0.001127164694480598,
0.0020711671095341444,
0.038990166038274765,
-0.0030310722067952156,
-0.013261590152978897,
0.015158350579440594,
0.01787375845015049,
0.02150031551718712,
-0.01414579339325428,
-0.01087750494480133,
-0.031551484018564224,
0.03693791851401329,
-0.04876874387264252,
0.033198032528162,
-0.0008998148841783404,
0.025022761896252632,
0.02153155766427517,
0.031150564551353455,
0.012694449163973331,
0.025216665118932724,
-0.025036532431840897,
-0.01528647355735302,
0.02587883174419403,
0.007003279402852058,
-0.038931142538785934,
0.0093992343172431,
-0.0352291576564312,
-0.29382655024528503,
0.008656660094857216,
0.03427589684724808,
0.007515639066696167,
-0.020472051575779915,
0.01940544880926609,
-0.004676192067563534,
-0.012927103787660599,
-0.06322138011455536,
0.012437527999281883,
-0.00783091876655817,
0.04212547093629837,
0.005131952930241823,
-0.0050582909025251865,
0.01414374727755785,
0.00809974130243063,
0.052983805537223816,
-0.04053438827395439,
0.0057920170947909355,
0.00970305223017931,
0.010941924527287483,
0.03179527446627617,
0.15323057770729065,
0.005643048323690891,
0.006959667429327965,
0.00013757664419244975,
-1.6971631566775613e-06,
0.007552433293312788,
-0.012880627997219563,
-0.02042868547141552,
0.023718440905213356,
0.0022838576696813107,
0.013517720624804497,
-0.02175792306661606,
-0.0009210885618813336,
0.015774134546518326,
-0.015349329449236393,
0.05633925646543503,
0.011824986897408962,
-0.00390510237775743,
-0.01163121871650219,
0.018930673599243164,
-0.028179243206977844,
-0.01438893098384142,
0.04144846647977829,
-0.02175223082304001,
-0.013296201825141907,
-0.027249742299318314,
0.01321756187826395,
0.004708074498921633,
-0.01436836551874876,
-0.00868219044059515,
-0.03285142034292221,
0.00456952303647995,
0.026425881311297417,
0.029904771596193314,
0.0017084190621972084,
-0.03230232000350952,
-0.012617474421858788,
-0.0292427409440279,
-0.0033859421964734793,
-0.039590779691934586,
-0.012841294519603252,
0.008753335103392601,
0.024074239656329155
],
"instruction": [
-0.028294799849390984,
0.011423577554523945,
0.036473676562309265,
0.014384294860064983,
0.033650998026132584,
0.044261567294597626,
0.054745864123106,
-0.006785567384213209,
-0.033210258930921555,
-0.004255346488207579,
-0.009542741812765598,
-0.06505352258682251,
0.020041724666953087,
0.005884387064725161,
0.023006301373243332,
0.009341963566839695,
0.013288628309965134,
0.020596183836460114,
-0.08866936713457108,
0.020365161821246147,
0.0039667654782533646,
-0.009743105620145798,
-0.01038470771163702,
0.03891463950276375,
0.04199279844760895,
-0.0015471188817173243,
0.017346808686852455,
-0.0009712407481856644,
-0.04156488552689552,
-0.08327898383140564,
-0.0057123564183712006,
-0.03304611146450043,
-0.014200099743902683,
-0.05025415122509003,
-0.03943734988570213,
-0.004892001859843731,
-0.03231222182512283,
0.0428633950650692,
-0.014364161528646946,
0.028033988550305367,
0.029457736760377884,
0.04017244279384613,
0.006841784808784723,
-0.040735870599746704,
-0.029939744621515274,
0.012204443104565144,
-0.007098079193383455,
0.00870603322982788,
0.11606526374816895,
-0.036759164184331894,
-0.0226057730615139,
-0.036044806241989136,
0.0036027065943926573,
0.013453883118927479,
0.04608047008514404,
0.023612817749381065,
0.04989304393529892,
0.021902846172451973,
-0.011633052490651608,
0.022141549736261368,
0.0015894151292741299,
0.05768429487943649,
-0.13913792371749878,
0.09128717333078384,
0.05641337111592293,
0.019719669595360756,
-0.0036547910422086716,
0.027990108355879784,
0.02811155840754509,
0.05238833278417587,
-0.051842622458934784,
0.00869784690439701,
0.047691915184259415,
0.08334841579198837,
-0.013117673806846142,
0.0003201559593435377,
8.419524237979203e-05,
-0.047789316624403,
0.007370067294687033,
0.012363560497760773,
-0.0031371808145195246,
0.004051032476127148,
0.03236107900738716,
-0.00014329193800222129,
-0.03795674443244934,
-0.05034990608692169,
0.0012106086360290647,
-0.02104824408888817,
-0.01086430437862873,
-0.0032299929298460484,
-0.03353969752788544,
-0.02411346323788166,
-0.013411669991910458,
-0.008812682703137398,
-0.03013485111296177,
-0.02009817771613598,
0.013921807520091534,
-0.0014325721422210336,
-0.08131148666143417,
0.49421632289886475,
-0.035576775670051575,
0.004619543440639973,
0.04555949568748474,
-0.03806179016828537,
-0.00013336131814867258,
-0.0585651732981205,
-0.004194274544715881,
-0.04407230019569397,
0.0060032126493752,
0.017919957637786865,
0.038701072335243225,
0.006876757834106684,
0.026622561737895012,
-0.04567642882466316,
0.008537987247109413,
0.05237656459212303,
0.058949194848537445,
-0.010129952803254128,
0.006053665652871132,
-0.0360737070441246,
-0.00012246076948940754,
-0.01624143123626709,
0.04624427482485771,
0.016046954318881035,
0.01543157547712326,
-0.07732819765806198,
0.020039336755871773,
0.08234314620494843,
0.024840623140335083,
0.07069077342748642,
0.03989553451538086,
-0.008631136268377304,
-0.04507581144571304,
-0.021784601733088493,
-0.019229508936405182,
-0.0377168208360672,
-0.00907179806381464,
0.007771300617605448,
0.0638672485947609,
-0.037488069385290146,
0.010747137479484081,
-0.0771736204624176,
0.0011695214780047536,
-0.15691794455051422,
0.01227235421538353,
0.08721382170915604,
0.010133261792361736,
0.011185379698872566,
-0.0003329571627546102,
-0.035310111939907074,
0.028933368623256683,
0.01930142007768154,
0.023206880316138268,
-0.021107438951730728,
0.011363566853106022,
0.019187364727258682,
0.061094529926776886,
-0.02340521477162838,
-0.044954899698495865,
0.023971516638994217,
-0.03909078612923622,
-0.02852088212966919,
-0.05408242344856262,
0.02216211147606373,
0.02574894391000271,
-0.061800774186849594,
0.02115439437329769,
-0.014760923571884632,
-0.021337050944566727,
-0.030090905725955963,
0.03165484219789505,
0.010940002277493477,
-0.030419979244470596,
0.016559641808271408,
0.1038859412074089,
0.010971220210194588,
-0.017817728221416473,
0.024410080164670944,
0.048050910234451294,
0.03745235875248909,
0.029996581375598907,
-0.02943653240799904,
-0.03860842064023018,
-0.006290885154157877,
0.021293338388204575,
-0.037811633199453354,
-0.028443869203329086,
-0.02794715203344822,
0.01686839759349823,
0.024009251967072487,
-0.04409133270382881,
-0.034815020859241486,
-0.0571603886783123,
0.006067034788429737,
-0.054615240544080734,
-0.017149915918707848,
-0.053462423384189606,
0.008000087924301624,
-0.0056595285423099995,
0.021827906370162964,
0.010296478867530823,
-0.032147377729415894,
0.01693701185286045,
0.060458943247795105,
0.032488394528627396,
-0.005344670731574297,
0.006128267385065556,
-0.05759930983185768,
0.09644084423780441,
0.04614530876278877,
-0.005628999788314104,
0.044932443648576736,
0.08120793849229813,
-0.004347485024482012,
-0.012239634990692139,
0.002865704009309411,
0.048002783209085464,
0.024727527052164078,
-0.04996514320373535,
0.03530338034033775,
0.06366550922393799,
0.001856555580161512,
-0.026324912905693054,
-0.2594948709011078,
-0.009771975688636303,
0.0008549326448701322,
-0.02988940291106701,
0.05125589296221733,
-0.005975003354251385,
0.04681583121418953,
-0.02462996169924736,
-0.003171533579006791,
0.030998433008790016,
0.04220284894108772,
-0.06249793618917465,
-0.018484387546777725,
-0.025125399231910706,
-0.031116079539060593,
0.03367865830659866,
0.024016913026571274,
0.022654514759778976,
-0.03607874736189842,
-0.008408503606915474,
-0.003392603015527129,
-0.003394088940694928,
0.027715187519788742,
-0.04477202147245407,
0.0698717013001442,
-0.025311101227998734,
0.19767506420612335,
0.03268478438258171,
-0.020445292815566063,
-0.001494147116318345,
0.021007763221859932,
-0.014887446537613869,
-0.009454205632209778,
-0.1457635909318924,
0.044506173580884933,
0.036704450845718384,
-0.009630566462874413,
-0.033140044659376144,
-0.06801245361566544,
-0.04351450130343437,
-0.0015428439946845174,
0.040272679179906845,
-0.029273075982928276,
-0.04262491688132286,
-0.08079501986503601,
-0.09341204911470413,
-0.008734806440770626,
0.022583896294236183,
-0.00786470714956522,
-0.01965724304318428,
0.03771577775478363,
0.02482139877974987,
-0.00796583667397499,
-0.021494679152965546,
0.0003737947263289243,
-0.005912111606448889,
-0.08274369686841965,
0.01637021079659462,
-0.061421725898981094,
0.004829832818359137,
-0.038608115166425705,
-0.031537361443042755,
0.03917916864156723,
-0.005888978950679302,
0.042416442185640335,
-0.03242899104952812,
-0.0019902149215340614,
-0.008614087477326393,
0.02787523716688156,
-0.0027530998922884464,
-0.01158168725669384,
-0.03733321651816368,
-0.022949688136577606,
-0.04117552191019058,
0.029149919748306274,
0.008193145506083965,
-0.01992698200047016,
-0.009709509089589119,
-0.0008522608550265431,
-0.02838626690208912,
-0.0007495849276892841,
-0.024111248552799225,
0.043145451694726944,
0.026467004790902138,
0.030171573162078857,
-0.027401920408010483,
0.012965810485184193,
-0.0112138157710433,
0.05209830775856972,
0.0041970144957304,
-0.01941954903304577,
0.024340109899640083,
-0.015719756484031677,
-0.040586892515420914,
0.03577498346567154,
-0.01582811400294304,
-0.32217153906822205,
-0.01248727086931467,
-0.031408052891492844,
0.04412638023495674,
-0.01473877765238285,
0.027508625760674477,
0.0105436434969306,
0.021679112687706947,
-0.059311412274837494,
0.030258214101195335,
0.029154274612665176,
0.02299693413078785,
0.012399664148688316,
-0.03594256937503815,
0.01200439315289259,
0.01518949493765831,
0.04612886905670166,
-0.012303836643695831,
0.01768031343817711,
-0.028339799493551254,
0.03841863572597504,
0.015603763051331043,
0.17313213646411896,
0.00765720522031188,
0.025147512555122375,
0.06211095303297043,
0.020088262856006622,
0.036083098500967026,
-0.02794945240020752,
0.01467492152005434,
0.03732845187187195,
0.0015544397756457329,
0.03362180292606354,
0.0004458639014046639,
0.05182427167892456,
0.014795232564210892,
-0.049910202622413635,
-0.017119428142905235,
-0.006229814607650042,
0.006820270791649818,
0.004686770029366016,
-0.019537806510925293,
-0.006466642487794161,
-0.014810604974627495,
0.0433243028819561,
-0.087904192507267,
-0.018105102702975273,
-0.006181234493851662,
0.009212393313646317,
-0.025008780881762505,
-0.04318060353398323,
0.009955020621418953,
-1.417224029864883e-05,
-0.0017516168300062418,
0.0240207239985466,
0.004205706063657999,
-0.015509650111198425,
-0.014545624144375324,
-0.007549952249974012,
-0.002600432839244604,
-0.003538699122145772,
-0.0565103143453598,
-0.023127390071749687,
0.03355006128549576,
0.02841961942613125
],
"output": [
-0.0176240187138319,
-0.004716125782579184,
0.006949670612812042,
-0.019411537796258926,
0.01567736640572548,
0.03834877535700798,
0.029715606942772865,
-0.005319108720868826,
-0.024479160085320473,
0.01141896191984415,
-0.03195776045322418,
-0.044211603701114655,
0.015640120953321457,
0.00799254234880209,
0.04004312679171562,
-0.03116476722061634,
0.01872282661497593,
0.007150548510253429,
-0.044442109763622284,
0.023735281080007553,
-0.0381082259118557,
0.006544189061969519,
-0.005491169635206461,
0.04239990934729576,
0.0483052060008049,
-0.003837290685623884,
0.004100733436644077,
-0.0035680716391652822,
-0.04005947709083557,
-0.1662059724330902,
-0.021250028163194656,
-0.017237992957234383,
0.027853040024638176,
-0.010291568003594875,
-0.03665821626782417,
-0.019267022609710693,
-0.05113506689667702,
0.021914202719926834,
-0.0053786118514835835,
0.052501581609249115,
0.05814570561051369,
0.010980119928717613,
0.011934413574635983,
-0.03319042921066284,
0.024707715958356857,
0.03283394128084183,
-0.027020296081900597,
0.015744253993034363,
0.1453917771577835,
-0.0009102760814130306,
0.0257569532841444,
-0.009953656233847141,
0.03505204617977142,
0.013196351006627083,
0.013851557858288288,
0.01405735220760107,
0.04858440160751343,
0.043133508414030075,
0.025982405990362167,
0.024609744548797607,
0.03449631854891777,
0.06881494075059891,
-0.14141228795051575,
0.07843177020549774,
0.006140443496406078,
0.011221753433346748,
-0.03221449255943298,
0.003841140540316701,
0.04540587589144707,
0.07630753517150879,
-0.04095779359340668,
0.032072145491838455,
0.034730445593595505,
0.06508953124284744,
-0.010172650218009949,
-0.022360237315297127,
0.029742486774921417,
-0.05383429676294327,
0.021864868700504303,
0.026357246562838554,
0.0022341941948980093,
0.005670070648193359,
0.027428705245256424,
0.01558087021112442,
-0.022692600265145302,
-0.017826268449425697,
-0.017297543585300446,
-0.04001662880182266,
-0.013342439197003841,
0.005840242374688387,
-0.030979914590716362,
-0.01724572479724884,
-0.024048786610364914,
0.0007530910661444068,
-0.04041028767824173,
-0.013380403630435467,
-0.020868824794888496,
0.0014694909332320094,
-0.08467952907085419,
0.580690324306488,
-0.004032657481729984,
0.05849768966436386,
0.004397968295961618,
0.010408908128738403,
-0.007641632575541735,
-0.02606048434972763,
-0.010855214670300484,
0.042734257876873016,
-0.011239683255553246,
0.019521446898579597,
0.0353398434817791,
0.03371568024158478,
0.023315098136663437,
-0.04357689991593361,
0.054250311106443405,
0.024925105273723602,
0.03813351318240166,
0.009227410890161991,
-0.01181033905595541,
-0.02605658583343029,
-0.0014186418848112226,
-0.01716744527220726,
0.060392748564481735,
-0.031001577153801918,
0.022377919405698776,
-0.052311576902866364,
0.02619396522641182,
0.06624552607536316,
0.033756088465452194,
0.02069525606930256,
0.048069678246974945,
0.020734800025820732,
-0.045360464602708817,
0.00034473047708161175,
-0.003251194953918457,
0.0029696219135075808,
-0.024894757196307182,
-0.040147315710783005,
0.03724127262830734,
-0.013771955855190754,
-0.01938561722636223,
-0.0677570104598999,
0.04945305734872818,
-0.13771581649780273,
-0.014849649742245674,
0.08132605254650116,
0.03424401208758354,
0.047011204063892365,
-0.029743820428848267,
-0.017329024150967598,
0.0065647610463202,
-0.004912176169455051,
-0.0041047860868275166,
-0.001819611992686987,
0.003893696703016758,
0.04609661549329758,
0.07918432354927063,
-0.005971760023385286,
-0.009723413735628128,
-0.03991689160466194,
-0.055766761302948,
0.0011914591304957867,
-0.04016687721014023,
0.014506371691823006,
0.03450341150164604,
-0.028972936794161797,
-0.019107777625322342,
-0.02772853896021843,
-0.005077740643173456,
-0.050147395581007004,
0.04929664358496666,
-0.006028413772583008,
-0.027990680187940598,
0.0063177370466291904,
0.09959689527750015,
0.0040030027739703655,
0.008515025489032269,
0.01651126518845558,
0.007028188090771437,
-0.0040906500071287155,
0.061634790152311325,
-0.06352051347494125,
-0.024248389527201653,
-0.015383181162178516,
0.026543831452727318,
-0.04806214198470116,
-0.02041323482990265,
-0.006135308649390936,
0.017517777159810066,
0.01024091336876154,
-0.0396764799952507,
0.00032474592444486916,
-0.07767457515001297,
-0.03382018581032753,
-0.050721388310194016,
-0.04525049775838852,
-0.046743810176849365,
-0.0227361973375082,
0.0029768687672913074,
-0.008010873571038246,
0.04743870720267296,
-0.0592835508286953,
-0.000213865379919298,
0.013153512962162495,
-0.0027194737922400236,
0.040022753179073334,
-0.003667823737487197,
-0.03996644914150238,
0.10362017899751663,
0.040296103805303574,
0.006855533923953772,
0.034558698534965515,
0.09931717813014984,
0.01592644862830639,
-0.04111484810709953,
0.03868898004293442,
0.025801660493016243,
0.02600902132689953,
0.020218197256326675,
0.03573831170797348,
0.06855560839176178,
-0.024668289348483086,
-0.05738578736782074,
-0.20787787437438965,
-0.02188975177705288,
0.00024557230062782764,
-0.016545936465263367,
0.03627012297511101,
-0.0286291241645813,
0.021834582090377808,
-0.051027849316596985,
-0.02713022194802761,
0.03910679742693901,
0.027725083753466606,
-0.03944699466228485,
-0.0265720933675766,
0.010004647076129913,
-0.044192299246788025,
0.0589180663228035,
0.017603101208806038,
0.011887047439813614,
-0.004177481401711702,
-0.017357105389237404,
0.022585542872548103,
-0.04550827294588089,
-0.005166847724467516,
-0.03875027596950531,
0.04337921366095543,
-0.02990945801138878,
0.1599227786064148,
0.007562293205410242,
0.01887401193380356,
0.0034982352517545223,
0.038471926003694534,
-0.01028207316994667,
-0.01702789030969143,
-0.1373232752084732,
0.025452803820371628,
-0.004819933325052261,
-0.022516997531056404,
-0.039742611348629,
-0.0214276984333992,
0.004858740605413914,
-0.02375420369207859,
0.008609913289546967,
-0.048986759036779404,
-0.008950253948569298,
-0.08449315279722214,
-0.026036735624074936,
-0.011276411823928356,
0.019190000370144844,
0.022633947432041168,
-0.040058594197034836,
0.04388629272580147,
0.031580597162246704,
0.01809006556868553,
-0.0030303390230983496,
-0.046913061290979385,
-0.005649161525070667,
-0.08374624699354172,
-0.0017140369163826108,
-0.03677505627274513,
0.0037890963722020388,
-0.04737789183855057,
-0.03700713813304901,
-0.003818045137450099,
0.009253373369574547,
0.05104338750243187,
0.002676995936781168,
-0.007246305234730244,
-0.02757888101041317,
-0.004555205814540386,
0.009329759515821934,
-0.03360270708799362,
0.0012659220956265926,
-0.04205852746963501,
-0.013427188619971275,
0.05246889963746071,
0.01665373146533966,
-0.0010568611323833466,
-0.009995844215154648,
0.001530607114546001,
-0.03711310774087906,
-0.013399848714470863,
-0.044277820736169815,
0.029894111678004265,
0.053477853536605835,
0.0006499383598566055,
-0.025698142126202583,
0.017410481348633766,
-0.02152882143855095,
0.03427017107605934,
-0.003780169179663062,
0.01745775155723095,
-0.019152434542775154,
-0.032994236797094345,
-0.04189195856451988,
0.0359673835337162,
-0.012284714728593826,
-0.2722877562046051,
0.01217364240437746,
-0.034562528133392334,
0.04906000941991806,
-0.022975552827119827,
0.016820218414068222,
0.035448189824819565,
0.003748809453099966,
-0.06079091504216194,
0.041749920696020126,
0.025426866486668587,
0.005696199368685484,
0.016359742730855942,
-0.013725521042943,
0.012083960697054863,
-0.0012312465114519,
0.015471730381250381,
-0.011446741409599781,
0.025711320340633392,
-0.01745632477104664,
0.003108542412519455,
-0.007072076201438904,
0.1382468193769455,
-0.019911963492631912,
0.009592660702764988,
0.05282985046505928,
0.030706806108355522,
0.00796398427337408,
-0.04627992957830429,
-0.009330281056463718,
0.010543725453317165,
-0.003632632317021489,
0.03128711134195328,
0.007868845015764236,
-0.010772887617349625,
-0.002463238313794136,
0.006074932403862476,
0.0006607716786675155,
0.018854515627026558,
-0.033307116478681564,
0.011923067271709442,
-0.018863312900066376,
-0.002774772234261036,
-0.0019064220832660794,
0.01827140338718891,
-0.08101121336221695,
-0.021098297089338303,
-0.02879447676241398,
-0.0014344346709549427,
-0.03157539293169975,
-0.03340595215559006,
-0.017963532358407974,
0.002487666206434369,
0.004737753886729479,
0.009413175284862518,
-0.004956235643476248,
-0.018281303346157074,
0.01808854751288891,
0.007452634163200855,
-0.0066879065707325935,
0.004095880780369043,
-0.05039187893271446,
-0.04649675264954567,
0.04778149351477623,
0.023173490539193153
],
"testing": null
}
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; at the moment only text fields are supported. These are the fields that will be shown to annotators when providing responses to the questions.
* **instruction** is of type `text`.
* (optional) **input** is of type `text`.
* **output** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **quality** is of type `rating` with the following allowed values [1, 2, 3, 4, 5].
* **explanation** is of type `text`.
* **Suggestions:** As of Argilla 1.13.0, suggestions have been included to ease or assist annotators during the annotation process. Suggestions are linked to existing questions, are always optional, and contain not only the suggestion itself but also any metadata linked to it, if applicable.
* (optional) **quality-suggestion** is of type `rating` with the following allowed values [1, 2, 3, 4, 5].
* (optional) **explanation-suggestion** is of type `text`.
* **✨ NEW** **Vectors**: As of Argilla 1.19.0, vectors have been included to add support for similarity search, so that similar records can be explored via vector search powered by the configured search engine. Vectors are always optional and are not visible in the UI; they are uploaded and used internally, and must match the dimensions previously defined in the vector settings.
* (optional) **input** is of type `float32` and has a dimension of (1, `384`).
* (optional) **instruction** is of type `float32` and has a dimension of (1, `384`).
* (optional) **output** is of type `float32` and has a dimension of (1, `384`).
* (optional) **testing** is of type `float32` and has a dimension of (1, `1`).
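Since each vector is a plain list of `float32` values (384 dimensions for `input`, `instruction`, and `output`), the vectors can also be used for offline similarity computations outside of Argilla's built-in search. A minimal sketch, assuming the dataset has been loaded with 🤗 `datasets` (the loading line is a placeholder, not this dataset's actual id):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    a = np.asarray(a, dtype=np.float32)
    b = np.asarray(b, dtype=np.float32)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical usage with the record schema shown above:
# ds = load_dataset("<this-dataset-id>", split="train")
# sim = cosine_similarity(ds[0]["vectors"]["instruction"],
#                         ds[1]["vectors"]["instruction"])
```

Records with a higher cosine similarity on the `instruction` vector correspond to what the UI surfaces via "find similar" when the same vector settings are used.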
Additionally, there are two more optional fields:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to give annotators extra context — for example, a link to the original source of the record, or details such as the author, the date, or the source. The metadata is always optional, and can potentially be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
irds/beir_quora | ---
pretty_name: '`beir/quora`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `beir/quora`
The `beir/quora` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/beir#beir/quora).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=522,931
- `queries` (i.e., topics); count=15,000
This dataset is used by: [`beir_quora_dev`](https://huggingface.co/datasets/irds/beir_quora_dev), [`beir_quora_test`](https://huggingface.co/datasets/irds/beir_quora_test)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/beir_quora', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
queries = load_dataset('irds/beir_quora', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Thakur2021Beir,
title = "BEIR: A Heterogenous Benchmark for Zero-shot Evaluation of Information Retrieval Models",
author = "Thakur, Nandan and Reimers, Nils and Rücklé, Andreas and Srivastava, Abhishek and Gurevych, Iryna",
journal= "arXiv preprint arXiv:2104.08663",
month = "4",
year = "2021",
url = "https://arxiv.org/abs/2104.08663",
}
```
|
deepghs/nsfw_detect | ---
license: mit
tags:
- art
size_categories:
- 10K<n<100K
---
The dataset used for training the NSFW Detect classification model is divided into five categories: `drawing`, `hentai`, `neutral`, `porn`, and `sexy`, following the format mentioned in [GantMan/nsfw_model](https://github.com/GantMan/nsfw_model) and [yangbisheng2009/nsfw-resnet](https://github.com/yangbisheng2009/nsfw-resnet). |
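When the five category folders are loaded with a folder-based loader (e.g. 🤗 Datasets' `imagefolder`), class ids typically follow the sorted folder names. A small sketch of that mapping (an assumption about the loader, not something stated by the card):

```python
# Class ids assigned in sorted folder-name order, as folder-based
# image loaders conventionally do.
CLASSES = sorted(["drawing", "hentai", "neutral", "porn", "sexy"])
LABEL2ID = {name: i for i, name in enumerate(CLASSES)}
ID2LABEL = {i: name for name, i in LABEL2ID.items()}


def label_name(label_id: int) -> str:
    """Map a predicted class id back to its category name."""
    return ID2LABEL[label_id]
```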
pribadihcr/cefr-cep-up-down-same-ABS-train | ---
dataset_info:
features:
- name: number
dtype: int64
- name: messages
sequence: string
splits:
- name: train
num_bytes: 501977650
num_examples: 3090707
- name: test
num_bytes: 62713681
num_examples: 386337
download_size: 326242105
dataset_size: 564691331
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
unanam/mdrama_dog | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcripts
dtype: string
splits:
- name: train
num_bytes: 4673179027.482265
num_examples: 5146
- name: test
num_bytes: 593177419.5930359
num_examples: 644
- name: valid
num_bytes: 585191953.8886989
num_examples: 643
download_size: 2283219574
dataset_size: 5851548400.964
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
swaption2009/20k-en-zh-translation-pinyin-hsk | ---
task_categories:
- translation
language:
- en
- zh
---
# 20,000+ Chinese sentences with translations and pinyin
- Source: https://mnemosyne-proj.org/cards/20000-chinese-sentences-translations-and-pinyin
- Contributed by: Brian Vaughan http://brianvaughan.net/
# Dataset Structure
Each sample consists of:
1. English sentence
2. HSK level
3. Chinese translation
4. Pinyin
5. separator ("\-\-")
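The five-part record layout above can be parsed with a short sketch (the field names `english`, `hsk`, `chinese`, and `pinyin` are illustrative, not part of the dataset, and it is assumed that `--` never occurs inside a field):

```python
def parse_records(text: str):
    """Parse the raw file into dicts, assuming each sample is four lines
    (English, HSK level, Chinese, pinyin) followed by a "--" separator."""
    records = []
    for chunk in text.split("--"):
        lines = [ln.strip() for ln in chunk.strip().splitlines() if ln.strip()]
        if len(lines) == 4:
            english, hsk, chinese, pinyin = lines
            records.append(
                {"english": english, "hsk": hsk, "chinese": chinese, "pinyin": pinyin}
            )
    return records
```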
# Other Info from the Source
### HSK level
All of the sentences are sample sentences intended to illustrate a particular word. The HSK level (in the category name) is the HSK level of the word the sentence illustrates. Note that HSK levels here range from 1 to 4.
### Limitation
The category search covers all characters at each level, including the characters that longer words are composed of. This is why sentences for even HSK level 4 words can appear in "limited 1."
For example, 作主 (zuo4zhu3) is an HSK level 4 word. It contains two characters that both appear in HSK level 1 words, so the sample sentence for 作主 (assuming it contains no other difficult words) might appear in the category "HSK 4; limited 1;"
|
laskinaa/WikiCCC | ---
license: agpl-3.0
task_categories:
- text-classification
language:
- en
- fr
- de
- ru
- sv
tags:
- webdataset
- wikipedia
---
# Dataset Card for WikiCCC
WikiCCC is a set of Wikipedia-based clustered (labeled) comparable corpora for clustering comparable corpora.
A detailed description can be found in the paper _Creating Clustered Comparable Corpora from Wikipedia with Different Fuzziness Levels and Language Representativity_ by Anna Laskina, Eric Gaussier, Gaelle Calvary and accepted at the 17th Workshop on Building and Using Comparable Corpora (BUCC 2024).
|
samitizerxu/algae-wirs | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '1'
'1': '2'
'2': '3'
'3': '4'
'4': '5'
'5': test
splits:
- name: train
num_bytes: 33936156.629999995
num_examples: 17035
- name: test
num_bytes: 12474396.284
num_examples: 6494
download_size: 45458394
dataset_size: 46410552.914
---
# Dataset Card for "algae-wirs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
metamath/codeparrot-ds-tokenized-128 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 8618263476
num_examples: 16702061
- name: valid
num_bytes: 48072624
num_examples: 93164
download_size: 3804670335
dataset_size: 8666336100
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
---
- A Python code dataset created by extracting only the Data Science-related code from the `transformersbook/codeparrot-train` dataset and tokenizing it with `huggingface-course/code-search-net-tokenizer`.
FourthBrainGenAI/AI-Superstar-Dataset | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 44747
num_examples: 148
download_size: 23888
dataset_size: 44747
---
# Dataset Card for "AI-Superstar-Dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KETI-AIR/kor_quail | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: challenge
path: data/challenge-*
dataset_info:
features:
- name: data_index_by_user
dtype: int32
- name: id
dtype: string
- name: context_id
dtype: string
- name: question_id
dtype: string
- name: domain
dtype: string
- name: metadata
struct:
- name: author
dtype: string
- name: title
dtype: string
- name: url
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: question_type
dtype: string
- name: answers
sequence: string
- name: correct_answer_id
dtype: int32
splits:
- name: train
num_bytes: 27612173
num_examples: 10246
- name: validation
num_bytes: 5860893
num_examples: 2164
- name: challenge
num_bytes: 1451663
num_examples: 556
download_size: 2671154
dataset_size: 34924729
license: cc-by-nc-sa-4.0
---
# Dataset Card for "kor_quail"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
# Source Data Citation Information
```
@inproceedings{DBLP:conf/aaai/RogersKDR20,
author = {Anna Rogers and
Olga Kovaleva and
Matthew Downey and
Anna Rumshisky},
title = {Getting Closer to {AI} Complete Question Answering: {A} Set of Prerequisite
Real Tasks},
booktitle = {The Thirty-Fourth {AAAI} Conference on Artificial Intelligence, {AAAI}
2020, The Thirty-Second Innovative Applications of Artificial Intelligence
Conference, {IAAI} 2020, The Tenth {AAAI} Symposium on Educational
Advances in Artificial Intelligence, {EAAI} 2020, New York, NY, USA,
February 7-12, 2020},
pages = {8722--8731},
publisher = {{AAAI} Press},
year = {2020},
url = {https://aaai.org/ojs/index.php/AAAI/article/view/6398},
timestamp = {Thu, 04 Jun 2020 13:18:48 +0200},
biburl = {https://dblp.org/rec/conf/aaai/RogersKDR20.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` |
flaviolima/coringaaa.zip | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_qqp_possessives_belong | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1810077
num_examples: 9983
- name: test
num_bytes: 18044755
num_examples: 98443
- name: train
num_bytes: 16625757
num_examples: 91164
download_size: 21769659
dataset_size: 36480589
---
# Dataset Card for "MULTI_VALUE_qqp_possessives_belong"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hugfaceguy0001/Novels | ---
dataset_info:
features:
- name: type
dtype: string
- name: title
dtype: string
- name: author
dtype: string
- name: intro
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 5555493738
num_examples: 10893
download_size: 3541005263
dataset_size: 5555493738
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
JuneKo/bookCover_sciFi_child_com_reli_marvel | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 4687644.0
num_examples: 100
download_size: 4639289
dataset_size: 4687644.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "bookCover_sciFi_child_com_reli_marvel"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
theblackcat102/llava-instruct-mix | ---
dataset_info:
features:
- name: image
dtype: image
- name: conversations
dtype: string
splits:
- name: train
num_bytes: 46019106088.205
num_examples: 272795
download_size: 20289135489
dataset_size: 46019106088.205
task_categories:
- visual-question-answering
language:
- en
tags:
- multimodal
- vision
size_categories:
- 100K<n<1M
license: cc-by-nc-4.0
---
# LLaVA Instruct Mix
OCR and ChartQA datasets have been mixed in for more text-extraction questions.
|
gmaijoe-emailchaser/emailchaser-llm-body-data-v0.0.1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 611717
num_examples: 404
download_size: 162582
dataset_size: 611717
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "emailchaser-llm-body-data-v0.0.1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/casablanca_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of casablanca/カサブランカ/卡萨布兰卡 (Azur Lane)
This is the dataset of casablanca/カサブランカ/卡萨布兰卡 (Azur Lane), containing 46 images and their tags.
The core tags of this character are `long_hair, breasts, purple_eyes, grey_hair, large_breasts, ponytail, hair_bow, bow, bangs, hair_between_eyes, sidelocks, very_long_hair, braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 46 | 77.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/casablanca_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 46 | 38.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/casablanca_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 116 | 83.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/casablanca_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 46 | 66.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/casablanca_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 116 | 128.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/casablanca_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). To use it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/casablanca_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Tag clustering results are listed below; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, blue_skirt, looking_at_viewer, midriff, navel, red_necktie, solo, closed_mouth, coat, crop_top, jacket, miniskirt, off_shoulder, pantyhose, pleated_skirt, simple_background, sleeveless_shirt, white_background, white_shirt, armpits, blonde_hair, flight_deck, floating_hair, full_body, hairclip, standing, stomach, thigh_boots |
| 1 | 34 |  |  |  |  |  | 1girl, cheerleader, armpit_cutout, looking_at_viewer, solo, covered_navel, miniskirt, pleated_skirt, blush, white_skirt, crop_top, leotard_under_clothes, cleavage_cutout, long_sleeves, black_leotard, blue_thighhighs, holding, ribbed_legwear, pom_pom_(cheerleading), open_mouth, two-tone_skirt, standing, smile, sweat, midriff, cowboy_shot, groin |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_gloves | blue_skirt | looking_at_viewer | midriff | navel | red_necktie | solo | closed_mouth | coat | crop_top | jacket | miniskirt | off_shoulder | pantyhose | pleated_skirt | simple_background | sleeveless_shirt | white_background | white_shirt | armpits | blonde_hair | flight_deck | floating_hair | full_body | hairclip | standing | stomach | thigh_boots | cheerleader | armpit_cutout | covered_navel | blush | white_skirt | leotard_under_clothes | cleavage_cutout | long_sleeves | black_leotard | blue_thighhighs | holding | ribbed_legwear | pom_pom_(cheerleading) | open_mouth | two-tone_skirt | smile | sweat | cowboy_shot | groin |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:---------------|:-------------|:--------------------|:----------|:--------|:--------------|:-------|:---------------|:-------|:-----------|:---------|:------------|:---------------|:------------|:----------------|:--------------------|:-------------------|:-------------------|:--------------|:----------|:--------------|:--------------|:----------------|:------------|:-----------|:-----------|:----------|:--------------|:--------------|:----------------|:----------------|:--------|:--------------|:------------------------|:------------------|:---------------|:----------------|:------------------|:----------|:-----------------|:-------------------------|:-------------|:-----------------|:--------|:--------|:--------------|:--------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 1 | 34 |  |  |  |  |  | X | | | | X | X | | | X | | | X | | X | | | X | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
emilykang/anatomy_train | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 811474665.5
num_examples: 1500
download_size: 779469107
dataset_size: 811474665.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-126m | ---
pretty_name: Evaluation run of AI-Sweden-Models/gpt-sw3-126m
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AI-Sweden-Models/gpt-sw3-126m](https://huggingface.co/AI-Sweden-Models/gpt-sw3-126m)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-126m\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-04T14:50:03.394382](https://huggingface.co/datasets/open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-126m/blob/main/results_2024-01-04T14-50-03.394382.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24514074736530633,\n\
\ \"acc_stderr\": 0.030375707776311822,\n \"acc_norm\": 0.24572511855617835,\n\
\ \"acc_norm_stderr\": 0.03116726699554371,\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.015176985027707693,\n \"mc2\": 0.4406746017669096,\n\
\ \"mc2_stderr\": 0.015032743284114658\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.1885665529010239,\n \"acc_stderr\": 0.011430897647675803,\n\
\ \"acc_norm\": 0.22013651877133106,\n \"acc_norm_stderr\": 0.01210812488346098\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2778331009759012,\n\
\ \"acc_stderr\": 0.004470152081675126,\n \"acc_norm\": 0.29555865365465045,\n\
\ \"acc_norm_stderr\": 0.004553609405747218\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \
\ \"acc_stderr\": 0.03455473702325438,\n \"acc_norm\": 0.2,\n \"\
acc_norm_stderr\": 0.03455473702325438\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23018867924528302,\n \"acc_stderr\": 0.02590789712240817,\n\
\ \"acc_norm\": 0.23018867924528302,\n \"acc_norm_stderr\": 0.02590789712240817\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.20833333333333334,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.030631145539198813,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.030631145539198813\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.22,\n \"acc_stderr\": 0.041633319989322716,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.041633319989322716\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23544973544973544,\n \"acc_stderr\": 0.021851509822031715,\n \"\
acc_norm\": 0.23544973544973544,\n \"acc_norm_stderr\": 0.021851509822031715\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.038095238095238106,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.038095238095238106\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2806451612903226,\n \"acc_stderr\": 0.02556060472102289,\n \"\
acc_norm\": 0.2806451612903226,\n \"acc_norm_stderr\": 0.02556060472102289\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2660098522167488,\n \"acc_stderr\": 0.031089826002937523,\n \"\
acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.031089826002937523\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.033464098810559534,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.033464098810559534\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.18686868686868688,\n \"acc_stderr\": 0.027772533334218977,\n \"\
acc_norm\": 0.18686868686868688,\n \"acc_norm_stderr\": 0.027772533334218977\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24352331606217617,\n \"acc_stderr\": 0.030975436386845443,\n\
\ \"acc_norm\": 0.24352331606217617,\n \"acc_norm_stderr\": 0.030975436386845443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.258974358974359,\n \"acc_stderr\": 0.02221110681006166,\n \
\ \"acc_norm\": 0.258974358974359,\n \"acc_norm_stderr\": 0.02221110681006166\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275777,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275777\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.02788682807838056,\n\
\ \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.02788682807838056\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21834862385321102,\n \"acc_stderr\": 0.017712600528722717,\n \"\
acc_norm\": 0.21834862385321102,\n \"acc_norm_stderr\": 0.017712600528722717\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3425925925925926,\n \"acc_stderr\": 0.032365852526021574,\n \"\
acc_norm\": 0.3425925925925926,\n \"acc_norm_stderr\": 0.032365852526021574\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501947,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501947\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.20253164556962025,\n \"acc_stderr\": 0.026160568246601453,\n \
\ \"acc_norm\": 0.20253164556962025,\n \"acc_norm_stderr\": 0.026160568246601453\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.0432704093257873,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.0432704093257873\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\
\ \"acc_stderr\": 0.028911208802749475,\n \"acc_norm\": 0.26495726495726496,\n\
\ \"acc_norm_stderr\": 0.028911208802749475\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23627075351213284,\n\
\ \"acc_stderr\": 0.0151904737170375,\n \"acc_norm\": 0.23627075351213284,\n\
\ \"acc_norm_stderr\": 0.0151904737170375\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.02355083135199509,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.02355083135199509\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.22695035460992907,\n \"acc_stderr\": 0.024987106365642973,\n \"\
acc_norm\": 0.22695035460992907,\n \"acc_norm_stderr\": 0.024987106365642973\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24315514993481094,\n\
\ \"acc_stderr\": 0.010956556654417367,\n \"acc_norm\": 0.24315514993481094,\n\
\ \"acc_norm_stderr\": 0.010956556654417367\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.03018753206032938,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.03018753206032938\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25326797385620914,\n \"acc_stderr\": 0.017593486895366835,\n \
\ \"acc_norm\": 0.25326797385620914,\n \"acc_norm_stderr\": 0.017593486895366835\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.21224489795918366,\n \"acc_stderr\": 0.026176967197866767,\n\
\ \"acc_norm\": 0.21224489795918366,\n \"acc_norm_stderr\": 0.026176967197866767\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.27710843373493976,\n\
\ \"acc_stderr\": 0.034843315926805875,\n \"acc_norm\": 0.27710843373493976,\n\
\ \"acc_norm_stderr\": 0.034843315926805875\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.03565079670708311,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.03565079670708311\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.015176985027707693,\n \"mc2\": 0.4406746017669096,\n\
\ \"mc2_stderr\": 0.015032743284114658\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5043409629044988,\n \"acc_stderr\": 0.0140519560640769\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \
\ \"acc_stderr\": 0.0007581501137225266\n }\n}\n```"
repo_url: https://huggingface.co/AI-Sweden-Models/gpt-sw3-126m
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|arc:challenge|25_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|arc:challenge|25_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|gsm8k|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|gsm8k|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hellaswag|10_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hellaswag|10_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-06T17-28-06.762179.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-50-03.394382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T14-50-03.394382.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- '**/details_harness|winogrande|5_2023-12-06T17-28-06.762179.parquet'
- split: 2024_01_04T14_50_03.394382
path:
- '**/details_harness|winogrande|5_2024-01-04T14-50-03.394382.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T14-50-03.394382.parquet'
- config_name: results
data_files:
- split: 2023_12_06T17_28_06.762179
path:
- results_2023-12-06T17-28-06.762179.parquet
- split: 2024_01_04T14_50_03.394382
path:
- results_2024-01-04T14-50-03.394382.parquet
- split: latest
path:
- results_2024-01-04T14-50-03.394382.parquet
---
# Dataset Card for Evaluation run of AI-Sweden-Models/gpt-sw3-126m
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AI-Sweden-Models/gpt-sw3-126m](https://huggingface.co/AI-Sweden-Models/gpt-sw3-126m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-126m",
"harness_winogrande_5",
	split="latest")
```
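The configuration names follow the pattern visible in the YAML above (`harness_` + the task name with `-` and `:` replaced by `_`, + the number of shots). A small helper (ours, not part of any official API) to build them:

```python
def config_name(task: str, n_shot: int) -> str:
    # e.g. "hendrycksTest-virology", 5 -> "harness_hendrycksTest_virology_5"
    return f"harness_{task.replace('-', '_').replace(':', '_')}_{n_shot}"

print(config_name("truthfulqa:mc", 0))  # harness_truthfulqa_mc_0
print(config_name("winogrande", 5))     # harness_winogrande_5
```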
## Latest results
These are the [latest results from run 2024-01-04T14:50:03.394382](https://huggingface.co/datasets/open-llm-leaderboard/details_AI-Sweden-Models__gpt-sw3-126m/blob/main/results_2024-01-04T14-50-03.394382.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.24514074736530633,
"acc_stderr": 0.030375707776311822,
"acc_norm": 0.24572511855617835,
"acc_norm_stderr": 0.03116726699554371,
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707693,
"mc2": 0.4406746017669096,
"mc2_stderr": 0.015032743284114658
},
"harness|arc:challenge|25": {
"acc": 0.1885665529010239,
"acc_stderr": 0.011430897647675803,
"acc_norm": 0.22013651877133106,
"acc_norm_stderr": 0.01210812488346098
},
"harness|hellaswag|10": {
"acc": 0.2778331009759012,
"acc_stderr": 0.004470152081675126,
"acc_norm": 0.29555865365465045,
"acc_norm_stderr": 0.004553609405747218
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.03455473702325438,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03455473702325438
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23018867924528302,
"acc_stderr": 0.02590789712240817,
"acc_norm": 0.23018867924528302,
"acc_norm_stderr": 0.02590789712240817
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198813,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198813
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322716,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322716
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23544973544973544,
"acc_stderr": 0.021851509822031715,
"acc_norm": 0.23544973544973544,
"acc_norm_stderr": 0.021851509822031715
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.038095238095238106,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.038095238095238106
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2806451612903226,
"acc_stderr": 0.02556060472102289,
"acc_norm": 0.2806451612903226,
"acc_norm_stderr": 0.02556060472102289
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.031089826002937523,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.031089826002937523
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.033464098810559534,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.033464098810559534
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18686868686868688,
"acc_stderr": 0.027772533334218977,
"acc_norm": 0.18686868686868688,
"acc_norm_stderr": 0.027772533334218977
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24352331606217617,
"acc_stderr": 0.030975436386845443,
"acc_norm": 0.24352331606217617,
"acc_norm_stderr": 0.030975436386845443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.258974358974359,
"acc_stderr": 0.02221110681006166,
"acc_norm": 0.258974358974359,
"acc_norm_stderr": 0.02221110681006166
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275777,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275777
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.02788682807838056,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.02788682807838056
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21834862385321102,
"acc_stderr": 0.017712600528722717,
"acc_norm": 0.21834862385321102,
"acc_norm_stderr": 0.017712600528722717
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3425925925925926,
"acc_stderr": 0.032365852526021574,
"acc_norm": 0.3425925925925926,
"acc_norm_stderr": 0.032365852526021574
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20253164556962025,
"acc_stderr": 0.026160568246601453,
"acc_norm": 0.20253164556962025,
"acc_norm_stderr": 0.026160568246601453
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.0432704093257873,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.0432704093257873
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.028911208802749475,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.028911208802749475
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23627075351213284,
"acc_stderr": 0.0151904737170375,
"acc_norm": 0.23627075351213284,
"acc_norm_stderr": 0.0151904737170375
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.25,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22695035460992907,
"acc_stderr": 0.024987106365642973,
"acc_norm": 0.22695035460992907,
"acc_norm_stderr": 0.024987106365642973
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24315514993481094,
"acc_stderr": 0.010956556654417367,
"acc_norm": 0.24315514993481094,
"acc_norm_stderr": 0.010956556654417367
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.03018753206032938,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.03018753206032938
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25326797385620914,
"acc_stderr": 0.017593486895366835,
"acc_norm": 0.25326797385620914,
"acc_norm_stderr": 0.017593486895366835
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.21224489795918366,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.21224489795918366,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-virology|5": {
"acc": 0.27710843373493976,
"acc_stderr": 0.034843315926805875,
"acc_norm": 0.27710843373493976,
"acc_norm_stderr": 0.034843315926805875
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.03565079670708311,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.03565079670708311
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25091799265605874,
"mc1_stderr": 0.015176985027707693,
"mc2": 0.4406746017669096,
"mc2_stderr": 0.015032743284114658
},
"harness|winogrande|5": {
"acc": 0.5043409629044988,
"acc_stderr": 0.0140519560640769
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225266
}
}
```
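As a minimal, hypothetical sketch of how per-task accuracies like the ones above can be macro-averaged into an overall score (the key names follow the JSON structure shown; the exact leaderboard aggregation may differ):

```python
def macro_average_acc(results: dict) -> float:
    """Mean 'acc' over per-task entries, skipping the aggregate 'all' key."""
    accs = [v["acc"] for k, v in results.items() if k != "all" and "acc" in v]
    return sum(accs) / len(accs)

# Toy sample shaped like the results JSON above
sample = {
    "all": {"acc": 0.25},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.2},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.3},
}
print(macro_average_acc(sample))  # 0.25
```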
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
LiangMen/jaychou | ---
license: other
---
|
Recife/Datasets | ---
license: cc0-1.0
---
|
benayas/banking_artificial_20pct_v2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1036296
num_examples: 10003
download_size: 325179
dataset_size: 1036296
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Yura32000/eurosat_enrichments | ---
dataset_info:
features:
- name: embedding
sequence: float32
splits:
- name: test
num_bytes: 8305200
num_examples: 2700
download_size: 10168796
dataset_size: 8305200
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
Alljoined/14_70 | ---
dataset_info:
features:
- name: EEG
sequence:
sequence: float64
- name: image
dtype: image
- name: subject_id
dtype: int32
- name: session
dtype: int32
- name: block
dtype: int32
- name: trial
dtype: int32
- name: 73k_id
dtype: int32
- name: coco_id
dtype: int32
- name: curr_time
dtype: float32
splits:
- name: train
num_bytes: 11257498991.25
num_examples: 42118
download_size: 9007735613
dataset_size: 11257498991.25
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cleanrl/summarize_from_feedback_tldr_3_filtered_oai_preprocessing_1704321749 | ---
dataset_info:
features:
- name: id
dtype: string
- name: subreddit
dtype: string
- name: title
dtype: string
- name: post
dtype: string
- name: summary
dtype: string
- name: query_token
sequence: int64
- name: query
dtype: string
- name: reference_response
dtype: string
- name: reference_response_token
sequence: int64
- name: reference_response_token_len
dtype: int64
- name: query_reference_response
dtype: string
- name: query_reference_response_token
sequence: int64
- name: query_reference_response_token_len
dtype: int64
splits:
- name: train
num_bytes: 1600440249
num_examples: 116722
- name: validation
num_bytes: 88425771
num_examples: 6447
- name: test
num_bytes: 89922466
num_examples: 6553
download_size: 551824607
dataset_size: 1778788486
---
# TL;DR SFT Dataset for OpenAI's [Summarize from Feedback](https://openai.com/blog/summarization/) task
The dataset is directly taken from https://github.com/openai/summarize-from-feedback/tree/700967448d10004279f138666442bf1497d0e705#reddit-tldr-dataset
These columns are taken directly from the aforementioned dataset:
* **id**: unique identifier for the post
* **subreddit**: subreddit the post was taken from
* **title**: title of the post
* **post**: body of the post
* **summary**: summary of the post
* **reference_response**: reference response for the post
These columns are added by this preprocessing script:
* **query**: length-limited query for summarization: OAI pre-processes the main text (title + subreddit + post), ensuring it has only 512 tokens; if the main text is too long, then it tries to truncate at the last `\n`. If it's too short it pads the main text ([summarize_from_feedback/tasks.py#L98-L165](https://github.com/openai/summarize-from-feedback/blob/700967448d10004279f138666442bf1497d0e705/summarize_from_feedback/tasks.py#L98-L165)). Padding is either space or `[PAD]` token (see Args below).
* **query_token**: tokenized version of `query`
* **reference_response_token**: tokenized version of `reference_response`
* **reference_response_token_len**: length of `reference_response_token`
* **query_reference_response**: concatenation of `query.strip()` and `reference_response`
* **query_reference_response_token**: tokenized version of `query_reference_response`, up to `max_sft_query_response_length` tokens
* **query_reference_response_token_len**: length of `query_reference_response_token`
# Args
```python
{'base_model': 'EleutherAI/pythia-160m',
'hf_entity': 'cleanrl',
'max_rm_query_response_length': 638,
'max_rm_response_length': 169,
'max_sft_query_response_length': 562,
'max_sft_response_length': 53,
'oai_params': TaskQueryHParams(length=512,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding=[50277],
pad_side='left'),
'push_to_hub': True}
{'format_str': 'SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
'length': 512,
'pad_side': 'left',
'padding': [50277],
'truncate_field': 'post',
'truncate_text': '\n'}
```
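The query construction described above can be sketched at the character level. This is only an illustration: the real pipeline operates on *tokens* (left-padding with token id 50277, i.e. `[PAD]`), and `format_query` is a hypothetical helper name, not part of the released code:

```python
def format_query(subreddit, title, post, length=512, truncate_text="\n", pad_char=" "):
    # Character-level sketch of the token-level preprocessing: format the
    # main text, truncate the post at its last newline if too long, and
    # left-pad the result if too short.
    template = "SUBREDDIT: r/{subreddit}\n\nTITLE: {title}\n\nPOST: {post}\n\nTL;DR:"
    query = template.format(subreddit=subreddit, title=title, post=post)
    while len(query) > length and post:
        cut = post.rfind(truncate_text)  # prefer cutting at the last newline
        if cut <= 0:
            post = post[: len(post) - (len(query) - length)]  # hard truncate
        else:
            post = post[:cut]
        query = template.format(subreddit=subreddit, title=title, post=post)
    return query.rjust(length, pad_char)  # left padding, as in pad_side='left'
```
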
|
zwang199/autonlp-data-traffic_nlp_binary | ---
language:
- en
task_categories:
- text-classification
---
# AutoNLP Dataset for project: traffic_nlp_binary
## Table of Contents
- [Dataset Description](#dataset-description)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
## Dataset Description
This dataset has been automatically processed by AutoNLP for project traffic_nlp_binary.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "1 train is still delayed in both directions",
"target": 1
},
{
"text": "maybe there was no train traffic ????. i know the feeling.",
"target": 1
}
]
```
### Data Fields
The dataset has the following fields (also called "features"):
```json
{
"target": "ClassLabel(num_classes=2, names=['0', '1'], names_file=None, id=None)",
"text": "Value(dtype='string', id=None)"
}
```
### Data Splits
This dataset is split into a train and a validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 2195 |
| valid | 549 |
|
jvhoffbauer/gsm8k-toolcalls | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: equations
sequence: string
- name: depths
sequence: int64
- name: toolcalls
sequence:
sequence: string
- name: answer_number
dtype: float64
splits:
- name: train
num_bytes: 2849714.8986265534
num_examples: 4128
- name: test
num_bytes: 555113
num_examples: 791
- name: eval
num_bytes: 316865.1013734467
num_examples: 459
download_size: 1871367
dataset_size: 3721693.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: eval
path: data/eval-*
---
|
vladman-25/flickr-30k-romanian-captions | ---
license: unknown
---
# Dataset Card for Flickr 30k Romanian Captions
### Dataset Summary
This dataset is a Romanian translation of the Flickr 30k captions dataset.
It was generated using [nllb-200-distilled-1.3B](https://huggingface.co/facebook/nllb-200-distilled-1.3B), with Hugging Face used for both tokenization and translation.
Observations:
* the translation keeps the context pretty well.
* there are a few grammatical errors: "Doi tineri sare peste un balustradă" ("two young people jumps over a railing")
* some translations are silly: "Un bărbat ţine o jucărie mare de leu împăiat." ("A man is holding a big stuffed lion toy."), "Un bărbat cu barbă care poartă un dulap." ("A bearded man wearing a wardrobe.")
### Languages
Romanian |
malucoelhaofc/DylanEnglishV2 | ---
license: openrail
---
|
zhaospei/cmg-codellama | ---
license: mit
task_categories:
- text2text-generation
language:
- en
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tyzhu/fw_bi_num_train_10000_eval_100 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval_find_word
path: data/eval_find_word-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 2135417
num_examples: 30200
- name: eval_find_word
num_bytes: 4823
num_examples: 100
download_size: 930254
dataset_size: 2140240
---
# Dataset Card for "fw_bi_num_train_10000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
khad55/wheat_heads | ---
license: mit
---
|
KayoSilva88777/Allan | ---
license: bigscience-openrail-m
---
|
mteb-pt/sprintduplicatequestions-pairclassification | ---
configs:
- config_name: pt-br
data_files:
- split: test
path: test_trans*
- split: validation
path: validation_trans*
--- |
paulooww/newteto | ---
license: openrail
---
|
arbml/ArSL21L | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': ain
'1': al
'2': aleff
'3': bb
'4': dal
'5': dha
'6': dhad
'7': fa
'8': gaaf
'9': ghain
'10': ha
'11': haa
'12': jeem
'13': kaaf
'14': khaa
'15': la
'16': laam
'17': meem
'18': nun
'19': ra
'20': saad
'21': seen
'22': sheen
'23': ta
'24': taa
'25': thaa
'26': thal
'27': toot
'28': waw
'29': ya
'30': yaa
'31': zay
splits:
- name: train
num_bytes: 647055283.152
num_examples: 14202
download_size: 846084553
dataset_size: 647055283.152
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage: [info]**
- **Repository: [info]**
- **Paper: [info]**
- **Leaderboard: [info]**
- **Point of Contact: [info]**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
open-llm-leaderboard/details_csitfun__llama-7b-logicot | ---
pretty_name: Evaluation run of csitfun/llama-7b-logicot
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [csitfun/llama-7b-logicot](https://huggingface.co/csitfun/llama-7b-logicot) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_csitfun__llama-7b-logicot\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T23:32:58.123828](https://huggingface.co/datasets/open-llm-leaderboard/details_csitfun__llama-7b-logicot/blob/main/results_2023-10-24T23-32-58.123828.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0020973154362416107,\n\
\ \"em_stderr\": 0.0004685065030368161,\n \"f1\": 0.05921245805369133,\n\
\ \"f1_stderr\": 0.0013293292378975478,\n \"acc\": 0.3378058405682715,\n\
\ \"acc_stderr\": 0.006578612863320816\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0020973154362416107,\n \"em_stderr\": 0.0004685065030368161,\n\
\ \"f1\": 0.05921245805369133,\n \"f1_stderr\": 0.0013293292378975478\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.675611681136543,\n\
\ \"acc_stderr\": 0.013157225726641632\n }\n}\n```"
repo_url: https://huggingface.co/csitfun/llama-7b-logicot
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T23_32_58.123828
path:
- '**/details_harness|drop|3_2023-10-24T23-32-58.123828.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T23-32-58.123828.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T23_32_58.123828
path:
- '**/details_harness|gsm8k|5_2023-10-24T23-32-58.123828.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T23-32-58.123828.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T23_32_58.123828
path:
- '**/details_harness|winogrande|5_2023-10-24T23-32-58.123828.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T23-32-58.123828.parquet'
- config_name: results
data_files:
- split: 2023_10_24T23_32_58.123828
path:
- results_2023-10-24T23-32-58.123828.parquet
- split: latest
path:
- results_2023-10-24T23-32-58.123828.parquet
---
# Dataset Card for Evaluation run of csitfun/llama-7b-logicot
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/csitfun/llama-7b-logicot
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [csitfun/llama-7b-logicot](https://huggingface.co/csitfun/llama-7b-logicot) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_csitfun__llama-7b-logicot",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T23:32:58.123828](https://huggingface.co/datasets/open-llm-leaderboard/details_csitfun__llama-7b-logicot/blob/main/results_2023-10-24T23-32-58.123828.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0020973154362416107,
"em_stderr": 0.0004685065030368161,
"f1": 0.05921245805369133,
"f1_stderr": 0.0013293292378975478,
"acc": 0.3378058405682715,
"acc_stderr": 0.006578612863320816
},
"harness|drop|3": {
"em": 0.0020973154362416107,
"em_stderr": 0.0004685065030368161,
"f1": 0.05921245805369133,
"f1_stderr": 0.0013293292378975478
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.675611681136543,
"acc_stderr": 0.013157225726641632
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
giux78/10000-30000-ultrafeedback-ita | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 147505067
num_examples: 20000
- name: test_sft
num_bytes: 154695659
num_examples: 23110
- name: train_gen
num_bytes: 1347396812
num_examples: 256032
- name: test_gen
num_bytes: 148276089
num_examples: 28304
download_size: 970019812
dataset_size: 1797873627
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
- split: train_gen
path: data/train_gen-*
- split: test_gen
path: data/test_gen-*
---
|
ChocolateBlack/Inori | ---
license: apache-2.0
---
|
ccao/monkey | ---
license: bsd
---
|
christophsonntag/OLID | ---
multilinguality:
- monolingual
paperswithcode_id: olid
task_categories:
- text-classification
language:
- en
annotations_creators:
- crowdsourced
pretty_name: Offensive Language Identification Dataset
configs:
- config_name: 1.0.0
data_files:
- split: train
path: train.csv
- split: test
path: test.csv
dataset_info:
- config_name: 1.0.0
features:
- name: id
dtype: int64
- name: tweet
dtype: string
- name: cleaned_tweet
dtype: string
- name: subtask_a
dtype: string
- name: subtask_b
dtype: string
- name: subtask_c
dtype: string
splits:
- name: train
num_examples: 13240
- name: test
num_examples: 860
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
The Offensive Language Identification Dataset (OLID) contains 14,100 tweets from Twitter, annotated with three hierarchical subcategories via crowdsourcing, and has been released together with
the paper [Predicting the Type and Target of Offensive Posts in Social Media](https://arxiv.org/abs/1902.09666).
Previous datasets mainly focused on detecting specific types of offensive messages (hate speech, cyberbulling, etc.) but did not consider offensive language as a whole.
This dataset is annotated using a hierarchical annotation scheme with up to 3 labels, corresponding to offensive language detection (OFF/NOT),
automatic categorization of offense types (TIN/UNT), and offense target identification (IND/GRP/OTH), described below.
The original data from the [GitHub repo](https://github.com/idontflow/OLID) is located in ```data/```; I joined all the separate files into train and test splits, usable with HF datasets.
## Dataset Details
"The gold labels were assigned taking the agreement of three annotators into consideration. No correction has been carried out on the crowdsourcing annotations.
Twitter user mentions were substituted by @USER and URLs have been substituted by URL.
OLID is annotated using a hierarchical annotation. Each instance contains up to 3 labels each corresponding to one of the following levels:
- Level (or sub-task) A: Offensive language identification;
- Level (or sub-task) B: Automatic categorization of offense types;
- Level (or sub-task) C: Offense target identification." ([Source](https://github.com/idontflow/OLID?tab=readme-ov-file#readme))
### Tasks and Labels ([Source](https://github.com/idontflow/OLID?tab=readme-ov-file#readme))
(A) Level A: Offensive language identification
- (NOT) Not Offensive - This post does not contain offense or profanity.
- (OFF) Offensive - This post contains offensive language or a targeted (veiled or direct) offense
In our annotation, we label a post as offensive (OFF) if it contains any form of non-acceptable language (profanity) or a targeted offense, which can be veiled or direct.
(B) Level B: Automatic categorization of offense types
- (TIN) Targeted Insult and Threats - A post containing an insult or threat to an individual, a group, or others (see categories in sub-task C).
- (UNT) Untargeted - A post containing non-targeted profanity and swearing.
Posts containing general profanity are not targeted, but they contain non-acceptable language.
(C) Level C: Offense target identification
- (IND) Individual - The target of the offensive post is an individual: a famous person, a named individual or an unnamed person interacting in the conversation.
- (GRP) Group - The target of the offensive post is a group of people considered as a unity due to the same ethnicity, gender or sexual orientation, political affiliation, religious belief, or something else.
- (OTH) Other – The target of the offensive post does not belong to any of the previous two categories (e.g., an organization, a situation, an event, or an issue)
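The hierarchy above constrains which label combinations can occur: sub-task B is only annotated for OFF posts, and sub-task C only for TIN posts. A minimal sketch of a validity check (the function name and the `None`-for-missing convention are illustrative, not part of the dataset):

```python
def valid_olid_labels(subtask_a, subtask_b=None, subtask_c=None):
    # Return True if the (A, B, C) label triple respects the OLID hierarchy.
    if subtask_a == "NOT":
        return subtask_b is None and subtask_c is None  # no further levels
    if subtask_a != "OFF":
        return False  # level A is binary: OFF or NOT
    if subtask_b == "UNT":
        return subtask_c is None  # untargeted profanity has no target
    if subtask_b == "TIN":
        return subtask_c in {"IND", "GRP", "OTH"}
    return False
```
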
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** English
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [GitHub Repository](https://github.com/idontflow/OLID)
- **Paper [optional]:** [Predicting the Type and Target of Offensive Posts in Social Media](https://arxiv.org/abs/1902.09666)
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
The goal of this dataset was
[More Information Needed]
### Source Data
The data originates from Twitter.
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
The authors retrieved the samples "from Twitter using its API and searching for keywords and constructions that are often included in
offensive messages, such as ‘she is’ or ‘to:BreitBartNews’" ([Source](https://arxiv.org/pdf/1902.09666.pdf)).
They used the following keywords (except for the first three rows)
| Keyword | Offensive % |
|-------------------|-------------|
| medical marijuana | 0.0 |
| they are | 5.9 |
| to:NewYorker | 8.3 |
| --------- | ----- |
| you are | 21.0 |
| she is | 26.6 |
| to:BreitBartNews | 31.6 |
| he is | 32.4 |
| gun control | 34.7 |
| -filter:safe | 58.9 |
| conservatives | 23.2 |
| antifa | 26.7 |
| MAGA | 27.7 |
| liberals | 38.0 |
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
Extensive information on this can be found in the [original paper](https://arxiv.org/pdf/1902.09666.pdf) in the Data Collection section.
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
The annotation has been executed in a crowdsourcing process, where the gold label has been created by considering the annotations of three different annotators.
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
Usernames have been replaced by "@USER", URLs by "URL".
[More Information Needed]
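The substitutions described above can be approximated with two regular expressions. This is an illustrative sketch, not the script actually used to produce the release:

```python
import re

def anonymize(tweet):
    # Replace user mentions with @USER and links with URL, mirroring the
    # substitutions applied to the released OLID tweets.
    tweet = re.sub(r"@\w+", "@USER", tweet)
    tweet = re.sub(r"https?://\S+", "URL", tweet)
    return tweet
```
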
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
|
open-llm-leaderboard/details_giraffe176__Open_Neural_Monarch_Maidv0.2 | ---
pretty_name: Evaluation run of giraffe176/Open_Neural_Monarch_Maidv0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [giraffe176/Open_Neural_Monarch_Maidv0.2](https://huggingface.co/giraffe176/Open_Neural_Monarch_Maidv0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_giraffe176__Open_Neural_Monarch_Maidv0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T07:32:07.995703](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__Open_Neural_Monarch_Maidv0.2/blob/main/results_2024-03-01T07-32-07.995703.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.642246743461646,\n\
\ \"acc_stderr\": 0.03206523313341268,\n \"acc_norm\": 0.6450886784981518,\n\
\ \"acc_norm_stderr\": 0.03270482941566527,\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.01611412415688245,\n \"mc2\": 0.4303534866042077,\n\
\ \"mc2_stderr\": 0.014830285179105224\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5964163822525598,\n \"acc_stderr\": 0.014337158914268441,\n\
\ \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104298\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6276638119896435,\n\
\ \"acc_stderr\": 0.004824393076826621,\n \"acc_norm\": 0.8260306711810397,\n\
\ \"acc_norm_stderr\": 0.0037830836739860636\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n\
\ \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n\
\ \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n\
\ \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n\
\ \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n\
\ \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"\
acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8282828282828283,\n \"acc_stderr\": 0.026869716187429914,\n \"\
acc_norm\": 0.8282828282828283,\n \"acc_norm_stderr\": 0.026869716187429914\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941187,\n\
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941187\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.02931820364520686,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.02931820364520686\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n\
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n\
\ \"acc_stderr\": 0.029918586707798827,\n \"acc_norm\": 0.726457399103139,\n\
\ \"acc_norm_stderr\": 0.029918586707798827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097652,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097652\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579832,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579832\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134128,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134128\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3106145251396648,\n\
\ \"acc_stderr\": 0.015476515438005566,\n \"acc_norm\": 0.3106145251396648,\n\
\ \"acc_norm_stderr\": 0.015476515438005566\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.024954184324879912,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.024954184324879912\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713002,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713002\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4641460234680574,\n\
\ \"acc_stderr\": 0.012737361318730583,\n \"acc_norm\": 0.4641460234680574,\n\
\ \"acc_norm_stderr\": 0.012737361318730583\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146293,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146293\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.018850084696468712,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.018850084696468712\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744546,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744546\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.01611412415688245,\n \"mc2\": 0.4303534866042077,\n\
\ \"mc2_stderr\": 0.014830285179105224\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7908445146014207,\n \"acc_stderr\": 0.011430450045881575\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5610310841546626,\n \
\ \"acc_stderr\": 0.013669500369036205\n }\n}\n```"
repo_url: https://huggingface.co/giraffe176/Open_Neural_Monarch_Maidv0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|arc:challenge|25_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|gsm8k|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hellaswag|10_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T07-32-07.995703.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T07-32-07.995703.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- '**/details_harness|winogrande|5_2024-03-01T07-32-07.995703.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T07-32-07.995703.parquet'
- config_name: results
data_files:
- split: 2024_03_01T07_32_07.995703
path:
- results_2024-03-01T07-32-07.995703.parquet
- split: latest
path:
- results_2024-03-01T07-32-07.995703.parquet
---
# Dataset Card for Evaluation run of giraffe176/Open_Neural_Monarch_Maidv0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [giraffe176/Open_Neural_Monarch_Maidv0.2](https://huggingface.co/giraffe176/Open_Neural_Monarch_Maidv0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_giraffe176__Open_Neural_Monarch_Maidv0.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-01T07:32:07.995703](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__Open_Neural_Monarch_Maidv0.2/blob/main/results_2024-03-01T07-32-07.995703.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.642246743461646,
"acc_stderr": 0.03206523313341268,
"acc_norm": 0.6450886784981518,
"acc_norm_stderr": 0.03270482941566527,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.01611412415688245,
"mc2": 0.4303534866042077,
"mc2_stderr": 0.014830285179105224
},
"harness|arc:challenge|25": {
"acc": 0.5964163822525598,
"acc_stderr": 0.014337158914268441,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.014084133118104298
},
"harness|hellaswag|10": {
"acc": 0.6276638119896435,
"acc_stderr": 0.004824393076826621,
"acc_norm": 0.8260306711810397,
"acc_norm_stderr": 0.0037830836739860636
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8282828282828283,
"acc_stderr": 0.026869716187429914,
"acc_norm": 0.8282828282828283,
"acc_norm_stderr": 0.026869716187429914
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941187,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941187
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.02931820364520686,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.02931820364520686
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.726457399103139,
"acc_stderr": 0.029918586707798827,
"acc_norm": 0.726457399103139,
"acc_norm_stderr": 0.029918586707798827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097652,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097652
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579832,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579832
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134128,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3106145251396648,
"acc_stderr": 0.015476515438005566,
"acc_norm": 0.3106145251396648,
"acc_norm_stderr": 0.015476515438005566
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.024954184324879912,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.024954184324879912
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713002,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713002
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4641460234680574,
"acc_stderr": 0.012737361318730583,
"acc_norm": 0.4641460234680574,
"acc_norm_stderr": 0.012737361318730583
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146293,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146293
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.018850084696468712,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.018850084696468712
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744546,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744546
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.01611412415688245,
"mc2": 0.4303534866042077,
"mc2_stderr": 0.014830285179105224
},
"harness|winogrande|5": {
"acc": 0.7908445146014207,
"acc_stderr": 0.011430450045881575
},
"harness|gsm8k|5": {
"acc": 0.5610310841546626,
"acc_stderr": 0.013669500369036205
}
}
```
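As a quick sanity check on these numbers (using a small hand-copied subset of the scores above; the variable names are illustrative, not part of the evaluation output), the per-task accuracies can be compared against the overall average:

```python
# Hand-copied subset of the per-task accuracies reported above
scores = {
    "harness|arc:challenge|25": 0.5964163822525598,
    "harness|hellaswag|10": 0.6276638119896435,
    "harness|winogrande|5": 0.7908445146014207,
    "harness|gsm8k|5": 0.5610310841546626,
}
overall_acc = 0.642246743461646  # the "all" accuracy above

# Tasks in this subset that beat the overall average
above_avg = sorted(task for task, acc in scores.items() if acc > overall_acc)
print(above_avg)  # ['harness|winogrande|5']
```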
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
cyanelis/4349 | ---
license: cc-by-nc-4.0
--- |
MagicHub/general-prompt-dataset | ---
license: cc-by-4.0
---
|
joujiboi/Tsukasa-Diffusion | ---
license: apache-2.0
---
|
kheopss/dataset_11M_Enno-Ai_EN_f2.0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: text2
dtype: string
splits:
- name: train
num_bytes: 18427799990
num_examples: 11794112
download_size: 3992719418
dataset_size: 18427799990
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/Open_Platypus_standardized_cluster_9 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 31316918
num_examples: 3397
download_size: 8579024
dataset_size: 31316918
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Open_Platypus_standardized_cluster_9"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TrainingDataPro/anti-spoofing_Real | ---
license: cc-by-nc-nd-4.0
task_categories:
- image-to-image
- video-classification
language:
- en
tags:
- code
dataset_info:
features:
- name: phone
dtype: string
- name: selfie
dtype: image
- name: video
dtype: string
- name: worker_id
dtype: string
- name: age
dtype: int8
- name: country
dtype: string
- name: gender
dtype: string
splits:
- name: train
num_bytes: 100634313
num_examples: 30
download_size: 568013310
dataset_size: 100634313
---
# Anti-Spoofing dataset: real
The dataset consists of 98,000 videos and selfies of unique people.
# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market/anti-spoofing-real?utm_source=huggingface&utm_medium=cpc&utm_campaign=anti-spoofing_Real) to discuss your requirements, learn about the price and buy the dataset.
# File with the extension .csv
includes the following information for each media file:
- **phone**: the device used to capture the media files,
- **selfie_link**: the URL to access the photo
- **video_link**: the URL to access the video
- **worker_id**: the identifier of the person who provided the media file,
- **age**: the age of the person,
- **country**: the country of origin of the person,
- **gender**: the gender of the person,
- **selfie_file_type**: the type of the photo,
- **video_file_type**: the type of the video
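As an illustration of how such a metadata file can be consumed (the file name and the sample values below are assumed for the sketch, not taken from the actual dataset), Python's built-in csv module is enough:

```python
import csv
import io

# A made-up row mirroring the columns described above
sample_csv = io.StringIO(
    "phone,selfie_link,video_link,worker_id,age,country,gender,"
    "selfie_file_type,video_file_type\n"
    "iPhone 12,img/1/selfie.jpg,img/1/video.mp4,w001,29,France,female,jpg,mp4\n"
)

rows = list(csv.DictReader(sample_csv))
print(rows[0]["worker_id"], rows[0]["age"])  # w001 29
```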
# Folder "img" with media files
- containing all the photos and videos
- which correspond to the data in the .csv file
**How it works**: *go to the first folder and you will see that it contains media files taken by a person whose parameters are specified in the first line of the .csv file.*
## [**TrainingData**](https://trainingdata.pro/data-market/anti-spoofing-real?utm_source=huggingface&utm_medium=cpc&utm_campaign=anti-spoofing_Real) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **<https://www.kaggle.com/trainingdatapro/datasets>**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
aryopg/mini_pubmed | ---
dataset_info:
features:
- name: abstract
dtype: string
splits:
- name: train
num_bytes: 13296774
num_examples: 10000
download_size: 7578772
dataset_size: 13296774
---
# Dataset Card for "mini_pubmed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Eson-llm/Dureader_Retrierval_ColBERTFormat | ---
license: mit
---
The data comes from Baidu's Dureader_Retrieval dataset.
The data format was converted into the format required for training ColBERT.
It includes the following three files:
|
open-llm-leaderboard/details_Dampish__Dante-2.8B | ---
pretty_name: Evaluation run of Dampish/Dante-2.8B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Dampish/Dante-2.8B](https://huggingface.co/Dampish/Dante-2.8B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Dampish__Dante-2.8B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T13:26:29.842810](https://huggingface.co/datasets/open-llm-leaderboard/details_Dampish__Dante-2.8B/blob/main/results_2023-09-17T13-26-29.842810.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\
\ \"em_stderr\": 0.00037786091964607033,\n \"f1\": 0.0017051174496644293,\n\
\ \"f1_stderr\": 0.00040455681041866965,\n \"acc\": 0.255327545382794,\n\
\ \"acc_stderr\": 0.007024647268145198\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964607033,\n\
\ \"f1\": 0.0017051174496644293,\n \"f1_stderr\": 0.00040455681041866965\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.510655090765588,\n\
\ \"acc_stderr\": 0.014049294536290396\n }\n}\n```"
repo_url: https://huggingface.co/Dampish/Dante-2.8B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T13_26_29.842810
path:
- '**/details_harness|drop|3_2023-09-17T13-26-29.842810.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T13-26-29.842810.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T13_26_29.842810
path:
- '**/details_harness|gsm8k|5_2023-09-17T13-26-29.842810.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T13-26-29.842810.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T13_26_29.842810
path:
- '**/details_harness|winogrande|5_2023-09-17T13-26-29.842810.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T13-26-29.842810.parquet'
- config_name: results
data_files:
- split: 2023_09_17T13_26_29.842810
path:
- results_2023-09-17T13-26-29.842810.parquet
- split: latest
path:
- results_2023-09-17T13-26-29.842810.parquet
---
# Dataset Card for Evaluation run of Dampish/Dante-2.8B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Dampish/Dante-2.8B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Dampish/Dante-2.8B](https://huggingface.co/Dampish/Dante-2.8B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Dampish__Dante-2.8B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T13:26:29.842810](https://huggingface.co/datasets/open-llm-leaderboard/details_Dampish__Dante-2.8B/blob/main/results_2023-09-17T13-26-29.842810.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964607033,
"f1": 0.0017051174496644293,
"f1_stderr": 0.00040455681041866965,
"acc": 0.255327545382794,
"acc_stderr": 0.007024647268145198
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964607033,
"f1": 0.0017051174496644293,
"f1_stderr": 0.00040455681041866965
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.510655090765588,
"acc_stderr": 0.014049294536290396
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BeIR/hotpotqa-generated-queries | ---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: beir
pretty_name: BEIR Benchmark
size_categories:
msmarco:
- 1M<n<10M
trec-covid:
- 100k<n<1M
nfcorpus:
- 1K<n<10K
nq:
- 1M<n<10M
hotpotqa:
- 1M<n<10M
fiqa:
- 10K<n<100K
arguana:
- 1K<n<10K
touche-2020:
- 100K<n<1M
cqadupstack:
- 100K<n<1M
quora:
- 100K<n<1M
dbpedia:
- 1M<n<10M
scidocs:
- 10K<n<100K
fever:
- 1M<n<10M
climate-fever:
- 1M<n<10M
scifact:
- 1K<n<10K
source_datasets: []
task_categories:
- text-retrieval
- zero-shot-retrieval
- information-retrieval
- zero-shot-information-retrieval
task_ids:
- passage-retrieval
- entity-linking-retrieval
- fact-checking-retrieval
- tweet-retrieval
- citation-prediction-retrieval
- duplication-question-retrieval
- argument-retrieval
- news-retrieval
- biomedical-information-retrieval
- question-answering-retrieval
---
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
```python
# Illustrative usage (assuming the datasets library is installed):
from datasets import load_dataset

data = load_dataset("BeIR/hotpotqa-generated-queries")
```
### Supported Tasks and Leaderboards
The benchmark supports a public leaderboard that compares retrieval models across all BEIR datasets, with nDCG@10 as the primary reported metric.
The current best performing models can be found [here](https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields `_id` with unique document identifier, `title` with document title (optional) and `text` with document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields `_id` with unique query identifier and `text` with query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-separated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score` in this order. Keep the first row as a header. For example: `q1 doc1 1`
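As a quick illustration of these three formats, here is a self-contained sketch (toy content only) that writes and parses minimal versions of the `corpus`, `queries` and `qrels` files:

```python
import csv
import json
import os
import tempfile

tmp = tempfile.mkdtemp()

# corpus.jsonl: one JSON object per line with _id, title, text
with open(os.path.join(tmp, "corpus.jsonl"), "w") as f:
    f.write(json.dumps({"_id": "doc1", "title": "Albert Einstein",
                        "text": "Albert Einstein was a German-born...."}) + "\n")

# queries.jsonl: one JSON object per line with _id, text
with open(os.path.join(tmp, "queries.jsonl"), "w") as f:
    f.write(json.dumps({"_id": "q1",
                        "text": "Who developed the mass-energy equivalence formula?"}) + "\n")

# qrels.tsv: tab-separated with a header row, then query-id, corpus-id, score
with open(os.path.join(tmp, "qrels.tsv"), "w", newline="") as f:
    w = csv.writer(f, delimiter="\t")
    w.writerow(["query-id", "corpus-id", "score"])
    w.writerow(["q1", "doc1", 1])

# Parse qrels back into the nested-dict form used throughout this card.
qrels = {}
with open(os.path.join(tmp, "qrels.tsv"), newline="") as f:
    reader = csv.reader(f, delimiter="\t")
    next(reader)  # skip the header row
    for qid, did, score in reader:
        qrels.setdefault(qid, {})[did] = int(score)

print(qrels)  # {'q1': {'doc1': 1}}
```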
### Data Instances
A high-level example of any BEIR dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
"text": "Albert Einstein was a German-born theoretical physicist. who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query-document relevance judgments, made up of:
- `query-id`: a `string` feature representing the query id
- `corpus-id`: a `string` feature, denoting the document id.
- `score`: an `int32` feature, denoting the relevance judgment between query and document.
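To show how the qrels structure is typically consumed, here is a small illustrative sketch computing Recall@k for a toy ranking (BEIR's own evaluation additionally reports metrics such as nDCG@10 and MAP; this helper is an assumption-free toy, not the benchmark's official code):

```python
def recall_at_k(qrels, rankings, k):
    """Fraction of relevant documents retrieved in the top-k, averaged over queries."""
    scores = []
    for qid, relevant in qrels.items():
        top_k = rankings.get(qid, [])[:k]
        hits = sum(1 for doc_id in top_k if relevant.get(doc_id, 0) > 0)
        scores.append(hits / len(relevant))
    return sum(scores) / len(scores)

# Toy qrels in the nested-dict form described above, plus a toy system ranking.
qrels = {"q1": {"doc1": 1}, "q2": {"doc2": 1}}
rankings = {"q1": ["doc1", "doc9"], "q2": ["doc7", "doc2"]}

print(recall_at_k(qrels, rankings, k=2))  # 1.0
```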
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset. |
LangChainDatasets/langchain-howto-queries | ---
dataset_info:
features:
- name: inputs
dtype: string
splits:
- name: train
num_bytes: 3419
num_examples: 50
download_size: 2769
dataset_size: 3419
---
# Dataset Card for "langchain-howto-queries"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/an_94_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of an_94/AN-94/AN-94 (Girls' Frontline)
This is the dataset of an_94/AN-94/AN-94 (Girls' Frontline), containing 500 images and their tags.
The core tags of this character are `long_hair, bangs, hairband, blonde_hair, blue_eyes, aqua_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 693.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/an_94_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 377.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/an_94_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1134 | 773.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/an_94_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 603.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/an_94_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1134 | 1.09 GiB | [Download](https://huggingface.co/datasets/CyberHarem/an_94_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/an_94_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 34 |  |  |  |  |  | 1girl, assault_rifle, holding_gun, jacket, solo, black_gloves, tactical_clothes, long_sleeves, closed_mouth, standing, shorts, looking_at_viewer, white_background |
| 1 | 7 |  |  |  |  |  | 1girl, assault_rifle, holding_gun, jacket, long_sleeves, mouth_mask, solo, black_gloves, boots, shorts |
| 2 | 13 |  |  |  |  |  | 1girl, black_gloves, long_sleeves, looking_at_viewer, solo, jacket, simple_background, closed_mouth, tactical_clothes, upper_body, white_background, sidelocks |
| 3 | 8 |  |  |  |  |  | 1girl, jacket, solo, upper_body, closed_mouth, white_background, looking_at_viewer, looking_away, simple_background, mask |
| 4 | 20 |  |  |  |  |  | hair_ribbon, 1girl, black_dress, collarbone, solo, black_gloves, necklace, braid, black_ribbon, closed_mouth, looking_at_viewer, off-shoulder_dress, blush, bare_shoulders, belt, black_hairband, alternate_costume, simple_background, sitting, white_background, sidelocks |
| 5 | 8 |  |  |  |  |  | blue_sky, day, black_bikini, looking_at_viewer, outdoors, 1girl, beach, blush, cleavage, closed_mouth, medium_breasts, solo, bare_shoulders, cloud, collarbone, navel, wet, holding, ocean, standing, thighs, alternate_breast_size, assault_rifle, green_eyes, large_breasts, side-tie_bikini_bottom, underboob, very_long_hair |
| 6 | 9 |  |  |  |  |  | 1girl, solo, black_skirt, looking_at_viewer, white_shirt, pleated_skirt, simple_background, closed_mouth, sailor_collar, serafuku, sitting, white_background, blush, holding, neckerchief, open_mouth, socks, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | assault_rifle | holding_gun | jacket | solo | black_gloves | tactical_clothes | long_sleeves | closed_mouth | standing | shorts | looking_at_viewer | white_background | mouth_mask | boots | simple_background | upper_body | sidelocks | looking_away | mask | hair_ribbon | black_dress | collarbone | necklace | braid | black_ribbon | off-shoulder_dress | blush | bare_shoulders | belt | black_hairband | alternate_costume | sitting | blue_sky | day | black_bikini | outdoors | beach | cleavage | medium_breasts | cloud | navel | wet | holding | ocean | thighs | alternate_breast_size | green_eyes | large_breasts | side-tie_bikini_bottom | underboob | very_long_hair | black_skirt | white_shirt | pleated_skirt | sailor_collar | serafuku | neckerchief | open_mouth | socks |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------------|:--------------|:---------|:-------|:---------------|:-------------------|:---------------|:---------------|:-----------|:---------|:--------------------|:-------------------|:-------------|:--------|:--------------------|:-------------|:------------|:---------------|:-------|:--------------|:--------------|:-------------|:-----------|:--------|:---------------|:---------------------|:--------|:-----------------|:-------|:-----------------|:--------------------|:----------|:-----------|:------|:---------------|:-----------|:--------|:-----------|:-----------------|:--------|:--------|:------|:----------|:--------|:---------|:------------------------|:-------------|:----------------|:-------------------------|:------------|:-----------------|:--------------|:--------------|:----------------|:----------------|:-----------|:--------------|:-------------|:--------|
| 0 | 34 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | | X | | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 13 |  |  |  |  |  | X | | | X | X | X | X | X | X | | | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | | X | X | | | | X | | | X | X | | | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 20 |  |  |  |  |  | X | | | | X | X | | | X | | | X | X | | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | X | | | X | | | | X | X | | X | | | | | | | | | | | X | | | | | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | | | | X | | | | X | X | | X | X | | | X | | | | | | | | | | | | X | | | | | X | | | | | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X |
|
linlanio/lldataset-zhishi-v1 | ---
license: apache-2.0
task_categories:
- summarization
language:
- zh
tags:
- biology
size_categories:
- 10K<n<100K
---
# Dataset
## Introduction
## Features
## How to Use
## References
1. https://github.com/QwenLM/Qwen-7B
## Contact Us
Website: https://www.linlan.io
Email: contact@linlan.io |
open-llm-leaderboard/details_microsoft__Orca-2-13b | ---
pretty_name: Evaluation run of microsoft/Orca-2-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [microsoft/Orca-2-13b](https://huggingface.co/microsoft/Orca-2-13b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_microsoft__Orca-2-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T00:44:18.166149](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__Orca-2-13b/blob/main/results_2023-12-30T00-44-18.166149.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.601679092820444,\n\
\ \"acc_stderr\": 0.03296876808787226,\n \"acc_norm\": 0.6064308784221981,\n\
\ \"acc_norm_stderr\": 0.03364034807631641,\n \"mc1\": 0.3990208078335373,\n\
\ \"mc1_stderr\": 0.017142825728496767,\n \"mc2\": 0.5642038222037025,\n\
\ \"mc2_stderr\": 0.01593463688746652\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5742320819112628,\n \"acc_stderr\": 0.014449464278868802,\n\
\ \"acc_norm\": 0.6092150170648464,\n \"acc_norm_stderr\": 0.014258563880513778\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6126269667396933,\n\
\ \"acc_stderr\": 0.004861544478451861,\n \"acc_norm\": 0.798546106353316,\n\
\ \"acc_norm_stderr\": 0.004002665957282747\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.03554180368025689,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.03554180368025689\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6736111111111112,\n\
\ \"acc_stderr\": 0.03921067198982266,\n \"acc_norm\": 0.6736111111111112,\n\
\ \"acc_norm_stderr\": 0.03921067198982266\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36772486772486773,\n \"acc_stderr\": 0.02483383982556242,\n \"\
acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.02483383982556242\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n\
\ \"acc_stderr\": 0.02499305339776481,\n \"acc_norm\": 0.7387096774193549,\n\
\ \"acc_norm_stderr\": 0.02499305339776481\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397447,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397447\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5923076923076923,\n \"acc_stderr\": 0.024915243985987847,\n\
\ \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.024915243985987847\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066475,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066475\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8146788990825689,\n \"acc_stderr\": 0.01665927970029582,\n \"\
acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.01665927970029582\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690876,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690876\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7816091954022989,\n\
\ \"acc_stderr\": 0.014774358319934504,\n \"acc_norm\": 0.7816091954022989,\n\
\ \"acc_norm_stderr\": 0.014774358319934504\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.025131000233647897,\n\
\ \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.025131000233647897\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3128491620111732,\n\
\ \"acc_stderr\": 0.01550689259464727,\n \"acc_norm\": 0.3128491620111732,\n\
\ \"acc_norm_stderr\": 0.01550689259464727\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.026925654653615697,\n\
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.026925654653615697\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4367666232073012,\n\
\ \"acc_stderr\": 0.01266770191960367,\n \"acc_norm\": 0.4367666232073012,\n\
\ \"acc_norm_stderr\": 0.01266770191960367\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5698529411764706,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.5698529411764706,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6062091503267973,\n \"acc_stderr\": 0.019766211991073066,\n \
\ \"acc_norm\": 0.6062091503267973,\n \"acc_norm_stderr\": 0.019766211991073066\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.03115715086935557,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.03115715086935557\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3990208078335373,\n\
\ \"mc1_stderr\": 0.017142825728496767,\n \"mc2\": 0.5642038222037025,\n\
\ \"mc2_stderr\": 0.01593463688746652\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237988\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.378316906747536,\n \
\ \"acc_stderr\": 0.013358407831777126\n }\n}\n```"
repo_url: https://huggingface.co/microsoft/Orca-2-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|arc:challenge|25_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|arc:challenge|25_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|drop|3_2023-11-23T09-00-59.774377.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T09-00-59.774377.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|gsm8k|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|gsm8k|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hellaswag|10_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hellaswag|10_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T09-00-59.774377.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T00-44-18.166149.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T00-44-18.166149.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- '**/details_harness|winogrande|5_2023-11-23T09-00-59.774377.parquet'
- split: 2023_12_30T00_44_18.166149
path:
- '**/details_harness|winogrande|5_2023-12-30T00-44-18.166149.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T00-44-18.166149.parquet'
- config_name: results
data_files:
- split: 2023_11_23T09_00_59.774377
path:
- results_2023-11-23T09-00-59.774377.parquet
- split: 2023_12_30T00_44_18.166149
path:
- results_2023-12-30T00-44-18.166149.parquet
- split: latest
path:
- results_2023-12-30T00-44-18.166149.parquet
---
# Dataset Card for Evaluation run of microsoft/Orca-2-13b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [microsoft/Orca-2-13b](https://huggingface.co/microsoft/Orca-2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
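For convenience, the timestamp-style split names (e.g. `2023_12_30T00_44_18.166149`) can be mapped back to Python `datetime` objects. This is an illustrative sketch; the helper name `split_name_to_datetime` is an assumption and not part of the dataset tooling:

```python
from datetime import datetime

def split_name_to_datetime(name: str) -> datetime:
    """Parse a run split name such as '2023_12_30T00_44_18.166149'
    into a datetime; underscores replace '-' in the date part
    and ':' in the time part."""
    date_part, time_part = name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)
```

This makes it easy to sort the available splits chronologically and pick a specific run.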
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_microsoft__Orca-2-13b",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-30T00:44:18.166149](https://huggingface.co/datasets/open-llm-leaderboard/details_microsoft__Orca-2-13b/blob/main/results_2023-12-30T00-44-18.166149.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.601679092820444,
"acc_stderr": 0.03296876808787226,
"acc_norm": 0.6064308784221981,
"acc_norm_stderr": 0.03364034807631641,
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496767,
"mc2": 0.5642038222037025,
"mc2_stderr": 0.01593463688746652
},
"harness|arc:challenge|25": {
"acc": 0.5742320819112628,
"acc_stderr": 0.014449464278868802,
"acc_norm": 0.6092150170648464,
"acc_norm_stderr": 0.014258563880513778
},
"harness|hellaswag|10": {
"acc": 0.6126269667396933,
"acc_stderr": 0.004861544478451861,
"acc_norm": 0.798546106353316,
"acc_norm_stderr": 0.004002665957282747
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.03554180368025689,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.03554180368025689
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6736111111111112,
"acc_stderr": 0.03921067198982266,
"acc_norm": 0.6736111111111112,
"acc_norm_stderr": 0.03921067198982266
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36772486772486773,
"acc_stderr": 0.02483383982556242,
"acc_norm": 0.36772486772486773,
"acc_norm_stderr": 0.02483383982556242
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.02499305339776481,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.02499305339776481
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397447,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397447
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5923076923076923,
"acc_stderr": 0.024915243985987847,
"acc_norm": 0.5923076923076923,
"acc_norm_stderr": 0.024915243985987847
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066475,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066475
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.01665927970029582,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.01665927970029582
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854052,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579647,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579647
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709697,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709697
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690876,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690876
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7816091954022989,
"acc_stderr": 0.014774358319934504,
"acc_norm": 0.7816091954022989,
"acc_norm_stderr": 0.014774358319934504
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6791907514450867,
"acc_stderr": 0.025131000233647897,
"acc_norm": 0.6791907514450867,
"acc_norm_stderr": 0.025131000233647897
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3128491620111732,
"acc_stderr": 0.01550689259464727,
"acc_norm": 0.3128491620111732,
"acc_norm_stderr": 0.01550689259464727
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.026925654653615697,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.026925654653615697
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4367666232073012,
"acc_stderr": 0.01266770191960367,
"acc_norm": 0.4367666232073012,
"acc_norm_stderr": 0.01266770191960367
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5698529411764706,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.5698529411764706,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6062091503267973,
"acc_stderr": 0.019766211991073066,
"acc_norm": 0.6062091503267973,
"acc_norm_stderr": 0.019766211991073066
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.03115715086935557,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.03115715086935557
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496767,
"mc2": 0.5642038222037025,
"mc2_stderr": 0.01593463688746652
},
"harness|winogrande|5": {
"acc": 0.7655880031570639,
"acc_stderr": 0.011906130106237988
},
"harness|gsm8k|5": {
"acc": 0.378316906747536,
"acc_stderr": 0.013358407831777126
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
loraxian/reddit-ootl-answers | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license: []
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
- text2text-generation
task_ids:
- text-scoring
pretty_name: r/OutOfTheLoop Questions and Answers
dataset_info:
features:
- name: body
dtype: string
- name: score_comment
dtype: int64
- name: link_id
dtype: string
- name: comment_id
dtype: string
- name: created_comment
dtype: string
- name: has_link_comment
dtype: bool
- name: title
dtype: string
- name: selftext
dtype: string
- name: score_submission
dtype: int64
- name: created_submission
dtype: string
- name: has_link_submission
dtype: bool
splits:
- name: train
num_bytes: 55558875
num_examples: 42152
download_size: 24532400
dataset_size: 55558875
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- reddit
- outoftheloop
---
## Dataset Description
This dataset includes all Reddit comments from the r/OutOfTheLoop subreddit between 2019-03 and 2023-02 that start with the text "**Answer:**".
Each row includes:
* body - Comment text
* score_comment - Reddit voted score of the comment
* comment_id - ID of comment
* link_id - ID of parent post
* created_comment - Date comment was created
* has_link_comment - Whether the comment text includes 'http://' or 'https://'
* title - Title of parent post
* selftext - Text of parent post
* has_link_submission - Whether the parent post selftext includes 'http://' or 'https://'
* score_submission - Score of parent post
* created_submission - Date parent post was created |
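
The selection criterion and the `has_link_*` fields described above can be sketched in plain Python. This is a hypothetical reimplementation for illustration, not the original collection script:

```python
# Sketch of the comment-selection and link-detection logic described above
# (hypothetical reimplementation -- not the original collection code).

def is_answer_comment(body: str) -> bool:
    """A comment qualifies for the dataset if its text starts with 'Answer:'."""
    return body.lstrip().startswith("Answer:")

def has_link(text: str) -> bool:
    """Mirrors the has_link_* fields: True if the text contains a URL prefix."""
    return "http://" in text or "https://" in text

comments = [
    {"body": "Answer: The meme comes from a 2019 video.", "score_comment": 42},
    {"body": "I have the same question.", "score_comment": 3},
]

# Keep only comments that would appear in this dataset.
answers = [c for c in comments if is_answer_comment(c["body"])]
```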
teo-sanchez/diffusiondb_ner | ---
layout: default
title: "Named Entity Recognition of DiffusionDB"
nav_order: 1
has_children: false
language_creators:
- found
language:
- en
license:
- cc-by-3.0
multilinguality:
- monolingual
pretty_name: NER-DiffusionDB
size_categories:
- 100M<n<1G
source_datasets:
- poloclub/diffusiondb
tags:
- stable diffusion
- prompt engineering
- prompts
- research paper
---

### Description
Extended dataset inferred by the named entity recognition model [en_ner_prompting](https://huggingface.co/teo-sanchez/en_ner_prompting), which was trained on hand-annotated prompts from [poloclub/diffusiondb](https://huggingface.co/datasets/poloclub/diffusiondb).
Because this dataset is inferred by that model, it may contain mistakes, especially in certain categories (cf. the model card).
The entities comprise 7 main categories and 11 subcategories, for a total of 16 labels, extracted from a topic analysis performed with [BERTopic](https://maartengr.github.io/BERTopic/index.html).
The topic analysis can be explored in [the following visualization](https://teo-sanchez.github.io/projects/prompting_map.html).
```
├── medium/
│ ├── photography
│ ├── painting
│ ├── rendering
│ └── illustration
├── influence/
│ ├── artist
│ ├── genre
│ ├── artwork
│ └── repository
├── light
├── color
├── composition
├── detail
└── context/
├── era
├── weather
└── emotion
```
### Label Scheme
<details>
<summary>View label scheme (16 labels for 1 component)</summary>
| Component | Labels |
| --- | --- |
| **`ner`** | `color`, `composition`, `context/emotion`, `context/era`, `context/weather`, `detail`, `influence/artist`, `influence/artwork`, `influence/genre`, `influence/repository`, `light`, `medium/illustration`, `medium/painting`, `medium/photography`, `medium/rendering`, `subject` |
</details> |
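
Since the label scheme uses `/` to encode the hierarchy shown in the tree above, the subcategory labels can be collapsed to their top-level category with a simple split. This is an illustrative helper, not part of the dataset itself:

```python
# Illustrative helper (not shipped with the dataset): collapse the
# hierarchical NER labels to their top-level category by splitting on '/'.

LABELS = [
    "color", "composition", "context/emotion", "context/era",
    "context/weather", "detail", "influence/artist", "influence/artwork",
    "influence/genre", "influence/repository", "light",
    "medium/illustration", "medium/painting", "medium/photography",
    "medium/rendering", "subject",
]

def top_level(label: str) -> str:
    """Return the top-level category, e.g. 'context/era' -> 'context'."""
    return label.split("/", 1)[0]

categories = sorted({top_level(label) for label in LABELS})
# e.g. ['color', 'composition', 'context', 'detail',
#       'influence', 'light', 'medium', 'subject']
```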
k0ntra/salam | ---
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
splits:
- name: train
num_bytes: 9216
num_examples: 3
download_size: 0
dataset_size: 9216
---
# Dataset Card for "salam"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_134 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1210938704.0
num_examples: 237812
download_size: 1236761748
dataset_size: 1210938704.0
---
# Dataset Card for "chunk_134"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HartreeCentre/JustiaCorpus | ---
annotations_creators:
- expert-generated
language_creators:
- found
- expert-generated
language:
- en
license: cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- summarization
- text2text-generation
task_ids:
- text-simplification
pretty_name: 'LegalOps: A summarisation corpus of Federal and Supreme Court Opinions
from the Justia Portal'
dataset_info:
- config_name: default
splits:
- name: train
num_bytes: 33980817
num_examples: 1022
download_size: 17759423
dataset_size: 33980817
- config_name: federal
features:
- name: fulltext
dtype: string
- name: summary
dtype: string
- name: tag
dtype: string
- name: url
dtype: string
- name: file_urls
sequence: string
- name: files
list:
- name: checksum
dtype: string
- name: path
dtype: string
- name: status
dtype: string
- name: url
dtype: string
- name: metadata
struct:
- name: court_id
dtype: string
- name: date
dtype: string
- name: number
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 625693535
num_examples: 284011
download_size: 309803008
dataset_size: 625693535
- config_name: federal-clean
features:
- name: fulltext
dtype: string
- name: summary
dtype: string
- name: tag
dtype: string
- name: url
dtype: string
- name: file_urls
sequence: string
- name: files
list:
- name: checksum
dtype: string
- name: path
dtype: string
- name: status
dtype: string
- name: url
dtype: string
- name: metadata
struct:
- name: court_id
dtype: string
- name: date
dtype: string
- name: number
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 585523330
num_examples: 16745
download_size: 301381329
dataset_size: 585523330
- config_name: supreme
features:
- name: Syllabus
dtype: string
- name: Dissent
dtype: string
- name: Opinion
dtype: string
- name: summary
dtype: string
- name: tag
dtype: string
- name: url
dtype: string
- name: file_urls
sequence: string
- name: files
list:
- name: checksum
dtype: string
- name: path
dtype: string
- name: status
dtype: string
- name: url
dtype: string
- name: metadata
struct:
- name: Advocates
dtype: string
- name: Argued
dtype: string
- name: Decided
dtype: string
- name: Docket No.
dtype: string
- name: First Party
dtype: string
- name: Granted
dtype: string
- name: Juris Postponed
dtype: string
- name: Official Citation
dtype: string
- name: Reargued
dtype: string
- name: Second Party
dtype: string
- name: page
dtype: int64
- name: volume
dtype: int64
splits:
- name: train
num_bytes: 33894538
num_examples: 1022
download_size: 17739369
dataset_size: 33894538
- config_name: supreme-clean
features:
- name: Syllabus
dtype: string
- name: Dissent
dtype: string
- name: Opinion
dtype: string
- name: summary
dtype: string
- name: tag
dtype: string
- name: url
dtype: string
- name: file_urls
sequence: string
- name: files
list:
- name: checksum
dtype: string
- name: path
dtype: string
- name: status
dtype: string
- name: url
dtype: string
- name: metadata
struct:
- name: Advocates
dtype: string
- name: Argued
dtype: string
- name: Decided
dtype: string
- name: Docket No.
dtype: string
- name: First Party
dtype: string
- name: Granted
dtype: string
- name: Juris Postponed
dtype: string
- name: Official Citation
dtype: string
- name: Reargued
dtype: string
- name: Second Party
dtype: string
- name: page
dtype: int64
- name: volume
dtype: int64
splits:
- name: train
num_bytes: 31766337
num_examples: 593
download_size: 17053691
dataset_size: 31766337
configs:
- config_name: default
data_files:
- split: train
path: supreme-clean/train-*
- config_name: federal
data_files:
- split: train
path: federal/train-*
- config_name: federal-clean
data_files:
- split: train
path: federal-clean/train-*
- config_name: supreme
data_files:
- split: train
path: supreme/train-*
- config_name: supreme-clean
data_files:
- split: train
path: supreme-clean/train-*
tags:
- legal
---
# Dataset Card for LegalOps JustiaCorpus
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/stfc/Justia-LegalOps
- **Repository:** https://github.com/stfc/Justia-LegalOps
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** robert.firth@stfc.ac.uk
### Dataset Summary
This is an English-language dataset of US Supreme Court and Federal Court cases with full texts and summaries. It comprises approximately 600 Supreme Court cases with summaries and syllabi, plus cases with missing data, scraped from https://supreme.justia.com/.
As the highest court in the nation, the U.S. Supreme Court has shaped the rights and freedoms of Americans since the Founding. Justia provides a free collection of all U.S. Supreme Court decisions from 1791 to the present.
The Federal Court data, scraped from https://law.justia.com/cases/federal/, is sparser than the Supreme Court data: approximately 17,000 of the 284,000 scraped records have non-zero-length full texts and summaries.
| Dataset Split | Number of Rows |
| --------- | --------- |
| `federal` | 284011 |
| `federal-clean` | 16818 |
| `supreme` | 1022 |
| `supreme-clean` | 593 |
### Supported Tasks and Leaderboards
* Text Summarisation
### Languages
* English (n.b. the BCP-47 code for English as generally spoken in the United States is `en-US`)
## Dataset Structure
### Data Instances
The data instances differ slightly between the Federal and Supreme Court data.
#### Supreme Court Data
Each instance in the `supreme` split consists of a string field containing the syllabus of the case (`Syllabus`), a string `summary` field containing a human-written summary of the case, and two other significant fields: one containing the opinion (`Opinion`) and one, optionally, the dissent (`Dissent`). The remaining fields are scrape metadata: a string tag identifying "federal" or "supreme" court should the splits be combined (`tag`), the scraped URL (`url`), a dictionary of metadata for the case itself (`metadata`), a list of file URLs (`file_urls`), and a more complete structure (`files`) consisting of one dictionary per downloaded PDF. The "path" key in this structure gives the file's location within the associated PDF tarball (see [Additional Data](#additional-data)).
```json
{
"Syllabus": "See United States v. Detroit Timber & Lumber Co.,\n\n200 U.S. 321, 337.\nSUPREME COURT OF THE UNITED STATES\nSyllabus\nGREENE, aka TRICE v. FISHER, SUPERINTENDENT, STATE CORRECTIONAL INSTITUTION AT SMITHFIELD, et al.\ncertiorari to the united states court of appeals for the third circuit\nNo. 10\u2013637.\u2003Argued October 11, 2011\u2014Decided November 8, 2011\nDuring petitioner Greene\u2019s trial for murder, robbery, and conspiracy, the prosecution introduced the redacted confessions of two of Greene\u2019s nontestifying codefendants. A jury convicted Greene. The Pennsylvania Superior Court...",
"Dissent": "",
    "Opinion": "SUPREME COURT OF THE UNITED STATES\n_________________\nNo. 10\u2013637\n_________________\nERIC GREENE, aka JARMAINE Q. TRICE, PETI- TIONER v. JON FISHER, SUPERINTENDENT, STATE CORRECTIONAL INSTITUTION AT SMITHFIELD, et al.\non writ of certiorari to the united states court of appeals for the third circuit\n[November 8, 2011]\n\nJustice Scalia delivered the opinion of the Court.\nUnder the Antiterrorism and Effective Death Penalty Act of 1996 (AEDPA), a federal court may not grant habeas relief to a state prisoner ...",
"tag": "supreme",
"url": "https://supreme.justia.com/cases/federal/us/565/34/",
"file_urls": [
"https://supreme.justia.com/cases/federal/us/565/10-637/case.pdf"
],
"files": [
{
"checksum": "7364db9dec242c4bf751cddd1082c714",
"path": "full/aff7f0b60e06bcdc14db0962db7a187460cf3d6e.pdf",
"status": "downloaded",
"url": "https://supreme.justia.com/cases/federal/us/565/10-637/case.pdf"
}
],
"metadata": {
"Advocates": null,
"Argued": "October 11, 2011",
"Decided": "November 8, 2011",
"Docket No.": "10-637",
"First Party": "Eric Greene, aka Jarmaine Q. Trice",
"Granted": "April 4, 2011",
"Juris Postponed": null,
"Official Citation": "565 U.S. 34",
"Reargued": null,
"Second Party": "Jon Fisher, Superintendent, State Correctional Institution at Smithfield, et al.",
"page": 34,
"volume": 565
},
}
```
#### Federal Court Data
Each instance consists of a string `fulltext` field containing the full text of the case and a string `summary` field containing a human-written summary of the case. The remaining fields are scrape metadata: a string tag identifying "federal" or "supreme" court should the splits be combined (`tag`), the scraped URL (`url`), a dictionary of metadata for the case itself (`metadata`), a list of file URLs (`file_urls`), and a more complete structure (`files`) consisting of one dictionary per downloaded PDF. The "path" key in this structure gives the file's location within the associated PDF tarball (see [Additional Data](#additional-data)).
```json
{
"fulltext": "Appeal from judgment of the United States District Court for the Western District of New York (Telesca, J.). The district court denied Petitioner habeas corpus relief after finding that Petitioner did not derive citizenship from his father; the district court ruled that Petitioner was not in his father s legal custody when his father naturalized. We conclude that the district court erred because it relied on an unenforceable custody award. Legal custody ...",
"summary": "Petitioner appealed from the district court's denial of habeas corpus relief after finding that he did not derive citizenship from his father. The district court ruled that petitioner was not in his father's \"legal custody\" when his father naturalized. The court concluded that the district court erred because it relied on an unenforceable Dominican Republic custody award where New York had jurisdiction to determine custody. Accordingly, the court vacated the judgment and remanded for further proceedings.",
"tag": "federal",
"url": "https://law.justia.com/cases/federal/appellate-courts/ca2/09-4211/09-4211_opn-2011-12-29.html",
"file_urls": [
"https://cases.justia.com/federal/appellate-courts/ca2/09-4211/09-4211_opn-2011-12-29.pdf"
],
"files": [
{
"checksum": "c48f9dd5a186a0e4dde4259085d99840",
"path": "full/e38b7ce3ff0e4f83a100f5e2cc57552591d033b0.pdf",
"status": "downloaded",
"url": "https://cases.justia.com/federal/appellate-courts/ca2/09-4211/09-4211_opn-2011-12-29.pdf"
}
],
"metadata": {
"court_id": "ca2",
"date": "2011-12-29",
"number": "09-4211",
"title": "Garcia v. USICE (Dept. of Homeland Security), No. 09-4211 (2d Cir. 2011)"
}
}
```
### Data Fields
[More Information Needed]
### Data Splits
This dataset comprises four splits: source-quality versions (`federal`/`supreme`) and pre-processed versions (`federal-clean`/`supreme-clean`). These are **not** currently further subdivided into `train/test/eval` splits. The default config points to the `supreme-clean` split.
```
JustiaCorpus/
├── README.md
├── federal/
├── federal-clean/
├── supreme/
└── supreme-clean/
```
The splits can be loaded as follows:
```python
from datasets import load_dataset

ds = load_dataset(
    "HartreeCentre/JustiaCorpus",
    split,
    token=HF_TOKEN_VALUE,
)["train"]
```
where `split` is one of `["supreme", "federal", "supreme-clean", "federal-clean"]`.
### Additional Data
Within this repository, two tarballs containing the full PDF documents for each dataset can be found in the corresponding `federal/` and `supreme/` directories:
* `federal/federal_pdfs.tar.gz`
* `supreme/supreme_pdfs.tar.gz`
To extract these, clone this dataset repo, navigate to the directory, and untar:
```bash
tar -xzvf federal_pdfs.tar.gz
```
The mapping between case and PDF is stored in the "files" field in the dataset.
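Resolving a record's PDFs against the extracted tarball can be done with a small helper. This is an illustrative sketch only: the function name `pdf_paths` and the `extracted_root` argument are not part of the dataset, and it assumes the tarball has been extracted so that each entry's `"path"` value is relative to `extracted_root`.

```python
import os


def pdf_paths(record, extracted_root="."):
    """Return local paths of the downloaded PDFs for one dataset record.

    Each entry in record["files"] carries a "path" key (e.g. "full/<sha1>.pdf")
    relative to the root of the extracted tarball; entries whose "status" is
    not "downloaded" are skipped.
    """
    return [
        os.path.join(extracted_root, f["path"])
        for f in record.get("files", [])
        if f.get("status") == "downloaded"
    ]
```

For the Supreme Court example above, `pdf_paths(record)` would point at `full/aff7f0b60e06bcdc14db0962db7a187460cf3d6e.pdf` under the extraction directory.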
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Personal and Sensitive Information
This dataset is not anonymized, so individuals' names appear in the dataset.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@INPROCEEDINGS{9378308,
author={Gargett, Andrew and Firth, Rob and Aletras, Nikolaos},
booktitle={2020 IEEE International Conference on Big Data (Big Data)},
title={LegalOps: A Summarization Corpus of Legal Opinions},
year={2020},
volume={},
number={},
pages={2117-2120},
doi={10.1109/BigData50022.2020.9378308}}
```
### Contributions
Thanks to [@RobFirth](https://github.com/RobFirth) for adding this dataset. |
OliveerEx/minhavoz | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_mnli_object_pronoun_drop | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 273865
num_examples: 1349
- name: dev_mismatched
num_bytes: 296997
num_examples: 1341
- name: test_matched
num_bytes: 274972
num_examples: 1275
- name: test_mismatched
num_bytes: 256848
num_examples: 1249
- name: train
num_bytes: 11047920
num_examples: 51298
download_size: 7523940
dataset_size: 12150602
---
# Dataset Card for "MULTI_VALUE_mnli_object_pronoun_drop"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
H4438/hieu-edu-date | ---
dataset_info:
features:
- name: title
dtype: string
- name: body
dtype: string
- name: url
dtype: string
- name: dates
sequence: string
- name: est_date
dtype: string
- name: ext_dates
sequence: string
- name: flt_dates
sequence: string
splits:
- name: train
num_bytes: 413651688
num_examples: 30758
download_size: 0
dataset_size: 413651688
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "hieu-edu-date"
Left: 5626 rows - 0.18 %
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CitrusBoy/NewsArticles | ---
license: mit
---
|
simmo/CanlIICaseSummaries | ---
license: apache-2.0
task_categories:
- summarization
- text-generation
language:
- en
tags:
- legal
size_categories:
- n<1K
---
# Canadian Case Law Summaries
A growing database of (currently) more than 600 case law summaries generated by GPT-4 for randomly selected Ontario and Canadian case law. |
Pushpahasa/body_wash | ---
license: openrail
task_categories:
- text-classification
language:
- en
tags:
- chemistry
pretty_name: body_wash
size_categories:
- n<1K
--- |
AdapterOcean/med_alpaca_standardized_cluster_11_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 12169229
num_examples: 7665
download_size: 6591472
dataset_size: 12169229
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_11_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xuanmo/xbcm | ---
license: cc0-1.0
task_categories:
- text-generation
language:
- zh
tags:
- not-for-all-audiences
pretty_name: pri_xbcm
size_categories:
- 100B<n<1T
--- |
open-llm-leaderboard/details_MaziyarPanahi__YamshadowInex12_Experiment26T3q | ---
pretty_name: Evaluation run of MaziyarPanahi/YamshadowInex12_Experiment26T3q
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/YamshadowInex12_Experiment26T3q](https://huggingface.co/MaziyarPanahi/YamshadowInex12_Experiment26T3q)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__YamshadowInex12_Experiment26T3q\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T10:27:46.019757](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__YamshadowInex12_Experiment26T3q/blob/main/results_2024-04-09T10-27-46.019757.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6510455186561941,\n\
\ \"acc_stderr\": 0.032057149961613414,\n \"acc_norm\": 0.6501342340668828,\n\
\ \"acc_norm_stderr\": 0.03273135985814274,\n \"mc1\": 0.6352509179926561,\n\
\ \"mc1_stderr\": 0.016850961061720134,\n \"mc2\": 0.7835453305304184,\n\
\ \"mc2_stderr\": 0.01361341647369438\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7158703071672355,\n \"acc_stderr\": 0.013179442447653886,\n\
\ \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7175861382194781,\n\
\ \"acc_stderr\": 0.004492535748097627,\n \"acc_norm\": 0.8925512846046604,\n\
\ \"acc_norm_stderr\": 0.003090499801090434\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726855,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993466,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993466\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n\
\ \"acc_stderr\": 0.0165136760311796,\n \"acc_norm\": 0.4212290502793296,\n\
\ \"acc_norm_stderr\": 0.0165136760311796\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967287,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967287\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015055,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015055\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6352509179926561,\n\
\ \"mc1_stderr\": 0.016850961061720134,\n \"mc2\": 0.7835453305304184,\n\
\ \"mc2_stderr\": 0.01361341647369438\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8492501973164956,\n \"acc_stderr\": 0.010056094631479674\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6952236542835482,\n \
\ \"acc_stderr\": 0.012679297549515427\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/YamshadowInex12_Experiment26T3q
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|arc:challenge|25_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|gsm8k|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hellaswag|10_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|winogrande|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T10-27-46.019757.parquet'
- config_name: results
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- results_2024-04-09T10-27-46.019757.parquet
- split: latest
path:
- results_2024-04-09T10-27-46.019757.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/YamshadowInex12_Experiment26T3q
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/YamshadowInex12_Experiment26T3q](https://huggingface.co/MaziyarPanahi/YamshadowInex12_Experiment26T3q) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__YamshadowInex12_Experiment26T3q",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-09T10:27:46.019757](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__YamshadowInex12_Experiment26T3q/blob/main/results_2024-04-09T10-27-46.019757.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6510455186561941,
"acc_stderr": 0.032057149961613414,
"acc_norm": 0.6501342340668828,
"acc_norm_stderr": 0.03273135985814274,
"mc1": 0.6352509179926561,
"mc1_stderr": 0.016850961061720134,
"mc2": 0.7835453305304184,
"mc2_stderr": 0.01361341647369438
},
"harness|arc:challenge|25": {
"acc": 0.7158703071672355,
"acc_stderr": 0.013179442447653886,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.7175861382194781,
"acc_stderr": 0.004492535748097627,
"acc_norm": 0.8925512846046604,
"acc_norm_stderr": 0.003090499801090434
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778394,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726855,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993466,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993466
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.0165136760311796,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.0165136760311796
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967287,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967287
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015055,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015055
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6352509179926561,
"mc1_stderr": 0.016850961061720134,
"mc2": 0.7835453305304184,
"mc2_stderr": 0.01361341647369438
},
"harness|winogrande|5": {
"acc": 0.8492501973164956,
"acc_stderr": 0.010056094631479674
},
"harness|gsm8k|5": {
"acc": 0.6952236542835482,
"acc_stderr": 0.012679297549515427
}
}
```
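As a quick sanity check, the per-task accuracies in the JSON above can be macro-averaged with a few lines of Python. This is only a sketch: the dictionary literal below copies just three of the task entries, so its mean will not match the full `all` aggregate.

```python
# Sketch: macro-average per-task accuracies from the results JSON.
# Only three of the tasks above are copied here for illustration.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5602409638554217},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8362573099415205},
    "harness|winogrande|5": {"acc": 0.8492501973164956},
}

accs = [v["acc"] for v in results.values()]
macro_acc = sum(accs) / len(accs)
print(f"macro-averaged acc over {len(accs)} tasks: {macro_acc:.4f}")
```

The same pattern applies to the full results file once it is loaded as a dictionary (e.g. with `json.load`).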
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
igbo_ner | ---
annotations_creators:
- found
language_creators:
- found
language:
- ig
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
paperswithcode_id: null
pretty_name: Igbo NER dataset
dataset_info:
- config_name: ner_data
features:
- name: content_n
dtype: string
- name: named_entity
dtype: string
- name: sentences
sequence: string
splits:
- name: train
num_bytes: 60315228
num_examples: 30715
download_size: 3311204
dataset_size: 60315228
- config_name: free_text
features:
- name: sentences
dtype: string
splits:
- name: train
num_bytes: 1172152
num_examples: 10000
download_size: 1132151
dataset_size: 1172152
---
# Dataset Card for Igbo NER dataset
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/IgnatiusEzeani/IGBONLP/tree/master/ig_ner
- **Repository:** https://github.com/IgnatiusEzeani/IGBONLP/tree/master/ig_ner
- **Paper:** https://arxiv.org/abs/2004.00648
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
Here is an example from the dataset:
```
{'content_n': 'content_0', 'named_entity': 'Ike Ekweremmadụ', 'sentences': ['Ike Ekweremmadụ', "Ike ịda jụụ otụ nkeji banyere oke ogbugbu na-eme n'ala Naijiria agwụla Ekweremmadụ"]}
```
### Data Fields
- content_n : ID
- named_entity : Name of the entity
- sentences : List of sentences for the entity
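A record can therefore be handled as a plain mapping from field names to values. The following sketch uses only the example instance shown above (no download involved):

```python
# Sketch: access the fields of an igbo_ner "ner_data" record.
# The record below is the example instance shown in this card.
record = {
    "content_n": "content_0",
    "named_entity": "Ike Ekweremmadụ",
    "sentences": [
        "Ike Ekweremmadụ",
        "Ike ịda jụụ otụ nkeji banyere oke ogbugbu na-eme n'ala Naijiria agwụla Ekweremmadụ",
    ],
}

# Count the sentences that mention the entity's surname.
surname = record["named_entity"].split()[-1]
mentions = [s for s in record["sentences"] if surname in s]
print(record["content_n"], len(mentions))
```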
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@misc{ezeani2020igboenglish,
  title={Igbo-English Machine Translation: An Evaluation Benchmark},
  author={Ignatius Ezeani and Paul Rayson and Ikechukwu Onyenwe and Chinedu Uchechukwu and Mark Hepple},
  year={2020},
  eprint={2004.00648},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@purvimisal](https://github.com/purvimisal) for adding this dataset. |
BangumiBase/masougakuenhxh | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Masou Gakuen Hxh
This is the image base of bangumi Masou Gakuen HxH. We detected 22 characters and 1642 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 183 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 62 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 55 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 160 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 21 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 80 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 488 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 32 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 12 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 5 | [Download](9/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 10 | 80 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 68 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 24 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 32 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 33 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 16 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 38 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 20 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 6 | [Download](18/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 19 | 67 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 7 | [Download](20/dataset.zip) |  |  |  |  |  |  |  | N/A |
| noise | 153 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
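The per-character archives follow the `<index>/dataset.zip` layout shown in the table above. A minimal sketch that enumerates those archive paths (the indices 0-20 and the `-1` noise bucket are taken from the table):

```python
# Sketch: enumerate the per-character archive paths of this image base.
# Character indices 0-20 plus the "-1" noise bucket, as listed in the table.
indices = list(range(21)) + [-1]
archives = [f"{i}/dataset.zip" for i in indices]
print(len(archives), archives[0], archives[-1])
```

Each path is relative to the repository root, so it can be passed to a downloader such as `huggingface_hub.hf_hub_download` with `repo_type="dataset"`.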
|
Aarif1430/english-to-bengali | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 52673198
num_examples: 183970
download_size: 21745709
dataset_size: 52673198
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-staging-eval-launch__gov_report-plain_text-1abd3a-16146232 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- launch/gov_report
eval_info:
task: summarization
model: google/bigbird-pegasus-large-bigpatent
metrics: ['bertscore']
dataset_name: launch/gov_report
dataset_config: plain_text
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: google/bigbird-pegasus-large-bigpatent
* Dataset: launch/gov_report
* Config: plain_text
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nonchalant-nagavalli](https://huggingface.co/nonchalant-nagavalli) for evaluating this model. |
Praghxx/Rick | ---
license: openrail
---
|
VuongQuoc/60k_dataset_multichoice_512 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: token_type_ids
sequence:
sequence: int8
- name: attention_mask
sequence:
sequence: int8
- name: label
dtype: int64
splits:
- name: train
num_bytes: 77100610
num_examples: 5000
- name: test
num_bytes: 3088000
num_examples: 200
download_size: 7918277
dataset_size: 80188610
---
# Dataset Card for "60k_dataset_multichoice_512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigbio/ctebmsp |
---
language:
- es
bigbio_language:
- Spanish
license: cc-by-nc-4.0
multilinguality: monolingual
bigbio_license_shortname: CC_BY_NC_4p0
pretty_name: CT-EBM-SP
homepage: http://www.lllf.uam.es/ESP/nlpmedterm_en.html
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- NAMED_ENTITY_RECOGNITION
---
# Dataset Card for CT-EBM-SP
## Dataset Description
- **Homepage:** http://www.lllf.uam.es/ESP/nlpmedterm_en.html
- **Pubmed:** True
- **Public:** True
- **Tasks:** NER
### Ctebmsp Abstracts
The "abstracts" subset of the Clinical Trials for Evidence-Based Medicine in Spanish
(CT-EBM-SP) corpus contains 500 abstracts of clinical trial studies in Spanish,
published in journals with a Creative Commons license. Most were downloaded from
the SciELO repository and free abstracts in PubMed.
Abstracts were retrieved with the query:
Clinical Trial[ptyp] AND “loattrfree full text”[sb] AND “spanish”[la].
(Information collected from 10.1186/s12911-021-01395-z)
### Ctebmsp Eudract
The "abstracts" subset of the Clinical Trials for Evidence-Based Medicine in Spanish
(CT-EBM-SP) corpus contains 500 abstracts of clinical trial studies in Spanish,
published in journals with a Creative Commons license. Most were downloaded from
the SciELO repository and free abstracts in PubMed.
Abstracts were retrieved with the query:
Clinical Trial[ptyp] AND “loattrfree full text”[sb] AND “spanish”[la].
(Information collected from 10.1186/s12911-021-01395-z)
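For NER training, span-level entity annotations of the kind used in this corpus are commonly converted to token-level IOB tags first. The sketch below illustrates that conversion with an invented sentence and entity span (not taken from the corpus), using naive whitespace tokenization:

```python
# Sketch: convert a character-level entity span to IOB tags.
# The sentence and span are invented for illustration only.
text = "El paciente recibe metformina diaria"
entities = [(19, 29, "CHEM")]  # (start, end, label) for "metformina"

tokens, tags, pos = [], [], 0
for tok in text.split():
    start = text.index(tok, pos)  # character offset of this token
    end = start + len(tok)
    pos = end
    label = "O"
    for (es, ee, lab) in entities:
        if start >= es and end <= ee:
            label = ("B-" if start == es else "I-") + lab
    tokens.append(tok)
    tags.append(label)

print(list(zip(tokens, tags)))
```

Real pipelines would use a proper tokenizer and handle partial token/span overlaps, but the offset arithmetic is the same.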
## Citation Information
```
@article{CampillosLlanos2021,
author = {Leonardo Campillos-Llanos and
Ana Valverde-Mateos and
Adri{'{a}}n Capllonch-Carri{'{o}}n and
Antonio Moreno-Sandoval},
title = {A clinical trials corpus annotated with {UMLS}
entities to enhance the access to evidence-based medicine},
journal = {{BMC} Medical Informatics and Decision Making},
volume = {21},
year = {2021},
url = {https://doi.org/10.1186/s12911-021-01395-z},
doi = {10.1186/s12911-021-01395-z},
biburl = {},
bibsource = {}
}
```
|
Multimodal-Fatima/Caltech101_with_background_test_facebook_opt_1.3b_Visclues_ns_6084_random | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_1_bs_16
num_bytes: 103748153.5
num_examples: 6084
- name: fewshot_3_bs_16
num_bytes: 107977706.5
num_examples: 6084
download_size: 202932897
dataset_size: 211725860.0
---
# Dataset Card for "Caltech101_with_background_test_facebook_opt_1.3b_Visclues_ns_6084_random"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
biglam/early_printed_books_font_detection_loaded | ---
dataset_info:
features:
- name: image
dtype: image
- name: labels
sequence:
class_label:
names:
0: greek
1: antiqua
2: other_font
3: not_a_font
4: italic
5: rotunda
6: textura
7: fraktur
8: schwabacher
9: hebrew
10: bastarda
11: gotico_antiqua
splits:
- name: test
num_bytes: 11398084794.636
num_examples: 10757
- name: train
num_bytes: 21512059165.866
num_examples: 24866
download_size: 44713803337
dataset_size: 32910143960.502
---
# Dataset Card for "early_printed_books_font_detection_loaded"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TigerResearch/tigerbot-law-plugin | ---
license: apache-2.0
language:
- zh
---
Raw "external brain" data used by the [Tigerbot](https://github.com/TigerResearch/TigerBot) model during rethink: 11 major categories of law, 55,000+ provisions in total.
- Constitution (宪法)
- Criminal Law (刑法)
- Administrative Law (行政法)
- Judicial Interpretations (司法解释)
- Civil and Commercial Law (民法商法)
- Civil Code (民法典)
- Administrative Regulations (行政法规)
- Social Law (社会法)
- Departmental Rules (部门规章)
- Economic Law (经济法)
- Litigation and Non-Litigation Procedure Law (诉讼与非诉讼程序法)
## Usage
```python
import datasets
ds_sft = datasets.load_dataset('TigerResearch/tigerbot-law-plugin')
``` |
Riksarkivet/placeholder_line_segmentation | ---
license: mit
task_categories:
- image-segmentation
- object-detection
---
## "Work in progress"
Coming soon!
# Dataset
WIP
### volumes
- Göteborgs_poliskammare_före_1900
## Contributions
WIP
## Acknowledgements
WIP |
thobauma/harmless-poisoned-0.005-SUDO-murder | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 58402939.44335993
num_examples: 42537
download_size: 31364075
dataset_size: 58402939.44335993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T21:08:18.556287](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down/blob/main/results_2023-10-25T21-08-18.556287.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.33536073825503354,\n\
\ \"em_stderr\": 0.004834914027583674,\n \"f1\": 0.3733011744966448,\n\
\ \"f1_stderr\": 0.004764578803547237,\n \"acc\": 0.4233861485316003,\n\
\ \"acc_stderr\": 0.009667062706266409\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.33536073825503354,\n \"em_stderr\": 0.004834914027583674,\n\
\ \"f1\": 0.3733011744966448,\n \"f1_stderr\": 0.004764578803547237\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07960576194086429,\n \
\ \"acc_stderr\": 0.007455924338676274\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7671665351223362,\n \"acc_stderr\": 0.011878201073856544\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|arc:challenge|25_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T21_08_18.556287
path:
- '**/details_harness|drop|3_2023-10-25T21-08-18.556287.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T21-08-18.556287.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T21_08_18.556287
path:
- '**/details_harness|gsm8k|5_2023-10-25T21-08-18.556287.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T21-08-18.556287.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hellaswag|10_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-43-00.841479.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T05-43-00.841479.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-04T05-43-00.841479.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T21_08_18.556287
path:
- '**/details_harness|winogrande|5_2023-10-25T21-08-18.556287.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T21-08-18.556287.parquet'
- config_name: results
data_files:
- split: 2023_10_04T05_43_00.841479
path:
- results_2023-10-04T05-43-00.841479.parquet
- split: 2023_10_25T21_08_18.556287
path:
- results_2023-10-25T21-08-18.556287.parquet
- split: latest
path:
- results_2023-10-25T21-08-18.556287.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
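As the config listing above shows, each run's split name is simply the run timestamp with dashes and colons replaced by underscores. A small illustrative helper (the function name is hypothetical, not part of any library) makes the convention explicit:

```python
# Illustrative only: derive a split name from a run timestamp, following
# the pattern visible in the config listing above
# ("2023-10-25T21:08:18.556287" -> "2023_10_25T21_08_18.556287").
def timestamp_to_split(ts: str) -> str:
    # Dashes and colons become underscores; the "T" separator and the
    # fractional seconds are kept as-is.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-25T21:08:18.556287"))
# -> 2023_10_25T21_08_18.556287
```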
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-25T21:08:18.556287](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE3_3.3w-r4-q_k_v_o_gate_up_down/blob/main/results_2023-10-25T21-08-18.556287.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find the results for each eval in its "latest" split):
```python
{
"all": {
"em": 0.33536073825503354,
"em_stderr": 0.004834914027583674,
"f1": 0.3733011744966448,
"f1_stderr": 0.004764578803547237,
"acc": 0.4233861485316003,
"acc_stderr": 0.009667062706266409
},
"harness|drop|3": {
"em": 0.33536073825503354,
"em_stderr": 0.004834914027583674,
"f1": 0.3733011744966448,
"f1_stderr": 0.004764578803547237
},
"harness|gsm8k|5": {
"acc": 0.07960576194086429,
"acc_stderr": 0.007455924338676274
},
"harness|winogrande|5": {
"acc": 0.7671665351223362,
"acc_stderr": 0.011878201073856544
}
}
```
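The `"all"` entry above aggregates the per-task entries, so it can be recomputed from them. A minimal sketch, assuming a dict shaped like the sample JSON above (`task_accuracies` is a hypothetical helper, not part of any library):

```python
# Sample shaped like the results JSON above (values copied from it).
results = {
    "all": {"acc": 0.4233861485316003, "acc_stderr": 0.009667062706266409},
    "harness|gsm8k|5": {"acc": 0.07960576194086429, "acc_stderr": 0.007455924338676274},
    "harness|winogrande|5": {"acc": 0.7671665351223362, "acc_stderr": 0.011878201073856544},
}

def task_accuracies(results: dict) -> dict:
    """Collect {task: acc} for every per-task entry that reports an 'acc'."""
    return {
        task: metrics["acc"]
        for task, metrics in results.items()
        if task != "all" and "acc" in metrics
    }

accs = task_accuracies(results)
# Matches results["all"]["acc"] up to floating-point rounding.
mean_acc = sum(accs.values()) / len(accs)
```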
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tyang816/MedChatZH | ---
license: apache-2.0
task_categories:
- question-answering
language:
- zh
tags:
- medical
- biology
size_categories:
- 100K<n<1M
---
# MedChatZH
## MedChatZH Chinese Fine-Tuned Weights
Trained on top of Baichuan-7B with traditional Chinese medicine (TCM) book data and medical instruction-dialogue data, giving it strong Chinese medical dialogue ability and **the ability to understand TCM questions**.
Github:[**MedChatZH-7B**](https://github.com/tyang816/MedChatZH)
## 🚀 Results
| Question | Answer |
| :----------------------------------------------------------- | ------------------------------------------------------------ |
| Is TCM effective for hair loss? I've been losing hair for 5–6 years; it's noticeably thinner than before. The loss is fairly even, with no spot losing much more than others; my hair is rather coarse and grows quickly. | The effect of Chinese herbal medicine on hair loss varies from person to person, but for a patient like you with good hair quality and fast growth, it may be a good option. Commonly used formulas include Liuwei Dihuang Wan and Siwu Tang, which nourish the liver and kidneys and promote blood circulation and nutrient supply to the hair follicles, thereby improving hair loss. Acupuncture and moxibustion can also serve as adjunct therapies. Note, however, that herbal treatment requires long-term adherence to take effect, and attention to diet and lifestyle is also needed for better results. It is recommended that you consult a professional TCM doctor or dermatologist to develop a personalized treatment plan. |
| How does TCM treat brain atrophy? Is TCM good for treating brain atrophy? How does TCM treat brain atrophy? Is TCM good for treating brain atrophy? | Chinese herbal medicine, acupuncture, tuina massage, and so on. Herbal medicine can be formulated for the patient's specific condition to regulate bodily functions; acupuncture improves blood circulation and nerve function by stimulating acupoints; tuina relieves pain and stiffness by massaging the head and neck muscles. Note that TCM treatment requires long-term adherence to achieve good results, and it should be combined with other rehabilitation measures and lifestyle changes, such as a sensible diet, moderate exercise, and keeping a cheerful mood. If you have any questions or discomfort, please consult a doctor promptly. |
| Is Chinese herbal medicine effective for lung cancer? Sorry to trouble you, doctor. Is it effective for lung cancer? My elderly family member has late-stage lung cancer; I've heard TCM can treat lung cancer, is that true? How effective is it? | Chinese herbal medicine plays a certain role in treating lung cancer. Note, however, that it is not suitable for everyone; appropriate drugs and dosages must be chosen according to the patient's specific condition, and its efficacy can only be determined with long-term observation. For your family member's late-stage lung cancer, seek medical care promptly and pursue comprehensive treatment, including surgery, radiotherapy, and chemotherapy. Adjunct TCM treatment can also be considered, but only under the guidance of a professional physician. I hope your family member recovers soon. |
| Small pimples keep appearing on my face lately... why? I'm dieting by skipping dinner; everything else is normal. But my bowel movements are very poor, which really troubles me; the constipation is severe. Pimples keep appearing on my forehead, cheeks, and along the nasolabial folds, one here and there; they hurt a little and have white pus... | Hello, based on your description, you may have acne and constipation. Skipping meals long-term deprives the body of nutrients, which affects skin health. You should aim for a balanced intake of foods, especially protein, vitamins A and B, and minerals. You could also try Chinese herbal remedies such as Huanglian Shangqing Wan or Longdan Xiegan Wan to relieve symptoms. For the acne, topical ointments such as retinoids or benzoyl peroxide gel can be used; oral antibiotics may be considered if symptoms are severe. As for the constipation, it can be improved by increasing dietary fiber intake and drinking more water; avoiding excessive straining during bowel movements is also very important. If you still feel unwell, seek medical attention promptly and consult a professional doctor. I hope my answer helps. |
## **🔥 Citation**
If you use our model, code, or data, please cite:
```
@article{tan2024medchatzh,
title={MedChatZH: A tuning LLM for traditional Chinese medicine consultations},
author={Tan, Yang and Zhang, Zhixing and Li, Mingchen and Pan, Fei and Duan, Hao and Huang, Zijie and Deng, Hua and Yu, Zhuohang and Yang, Chen and Shen, Guoyang and others},
journal={Computers in Biology and Medicine},
pages={108290},
year={2024},
publisher={Elsevier}
}
```
Please also cite projects such as BELLE and LLaMA.
## **🐼 Usage Restrictions**
- **The models and datasets of this project, and their derivatives, are for research purposes only; they must not be used commercially or in any other way that could harm society.**
- **This project does not represent the position, interests, or views of any party, and is unrelated to any claim of any kind by any group; the project assumes no responsibility for any damage or dispute arising from the use of its models, datasets, etc.**
- Instructions involving factual matters may produce incorrect answers that contradict the facts.
- Harmful instructions cannot be reliably identified, which can lead to harmful outputs.
- The model's capabilities in scenarios involving reasoning, code, and the like still need improvement.
Falah/action_actor_prompts_SDXL | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 709387340
num_examples: 1000000
download_size: 85090582
dataset_size: 709387340
---
# Dataset Card for "action_actor_prompts_SDXL"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/metatree_BNG_glass_ | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 8868064
num_examples: 96392
- name: validation
num_bytes: 3807788
num_examples: 41389
download_size: 11206380
dataset_size: 12675852
---
# Dataset Card for "metatree_BNG_glass_"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v1 | ---
pretty_name: Evaluation run of JaeyeonKang/CCK_Asura_v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JaeyeonKang/CCK_Asura_v1](https://huggingface.co/JaeyeonKang/CCK_Asura_v1) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-12T04:58:51.033818](https://huggingface.co/datasets/open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v1/blob/main/results_2024-02-12T04-58-51.033818.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7535469467828841,\n\
\ \"acc_stderr\": 0.028473742983492905,\n \"acc_norm\": 0.7564527472308834,\n\
\ \"acc_norm_stderr\": 0.029025433712812198,\n \"mc1\": 0.565483476132191,\n\
\ \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.7174856574663107,\n\
\ \"mc2_stderr\": 0.014605715133518151\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068749,\n\
\ \"acc_norm\": 0.7389078498293515,\n \"acc_norm_stderr\": 0.012835523909473848\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.719577773351922,\n\
\ \"acc_stderr\": 0.004482874732237349,\n \"acc_norm\": 0.8906592312288388,\n\
\ \"acc_norm_stderr\": 0.003114285077228029\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7111111111111111,\n\
\ \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.7111111111111111,\n\
\ \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8552631578947368,\n \"acc_stderr\": 0.0286319518459304,\n\
\ \"acc_norm\": 0.8552631578947368,\n \"acc_norm_stderr\": 0.0286319518459304\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372267,\n\
\ \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.026280550932848087,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.026280550932848087\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.7514450867052023,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4803921568627451,\n \"acc_stderr\": 0.04971358884367406,\n\
\ \"acc_norm\": 0.4803921568627451,\n \"acc_norm_stderr\": 0.04971358884367406\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n\
\ \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7404255319148936,\n \"acc_stderr\": 0.02865917937429232,\n\
\ \"acc_norm\": 0.7404255319148936,\n \"acc_norm_stderr\": 0.02865917937429232\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n\
\ \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n\
\ \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.03724563619774632,\n\
\ \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.03724563619774632\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5343915343915344,\n \"acc_stderr\": 0.02569032176249385,\n \"\
acc_norm\": 0.5343915343915344,\n \"acc_norm_stderr\": 0.02569032176249385\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.864516129032258,\n\
\ \"acc_stderr\": 0.019469334586486933,\n \"acc_norm\": 0.864516129032258,\n\
\ \"acc_norm_stderr\": 0.019469334586486933\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.034139638059062345,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.034139638059062345\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\"\
: 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\
\ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9040404040404041,\n \"acc_stderr\": 0.020984808610047933,\n \"\
acc_norm\": 0.9040404040404041,\n \"acc_norm_stderr\": 0.020984808610047933\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607558,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607558\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7897435897435897,\n \"acc_stderr\": 0.020660597485026945,\n\
\ \"acc_norm\": 0.7897435897435897,\n \"acc_norm_stderr\": 0.020660597485026945\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.42592592592592593,\n \"acc_stderr\": 0.030149135601365944,\n \
\ \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.030149135601365944\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.865546218487395,\n \"acc_stderr\": 0.022159373072744442,\n \
\ \"acc_norm\": 0.865546218487395,\n \"acc_norm_stderr\": 0.022159373072744442\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248436,\n \"\
acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248436\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9192660550458716,\n \"acc_stderr\": 0.011680172292862086,\n \"\
acc_norm\": 0.9192660550458716,\n \"acc_norm_stderr\": 0.011680172292862086\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6990740740740741,\n \"acc_stderr\": 0.031280390843298804,\n \"\
acc_norm\": 0.6990740740740741,\n \"acc_norm_stderr\": 0.031280390843298804\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073315,\n \"\
acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073315\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9240506329113924,\n \"acc_stderr\": 0.0172446332510657,\n \
\ \"acc_norm\": 0.9240506329113924,\n \"acc_norm_stderr\": 0.0172446332510657\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.02871877688934232,\n\
\ \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.02871877688934232\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9090909090909091,\n \"acc_stderr\": 0.026243194054073878,\n \"\
acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.026243194054073878\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8796296296296297,\n\
\ \"acc_stderr\": 0.0314570385430625,\n \"acc_norm\": 0.8796296296296297,\n\
\ \"acc_norm_stderr\": 0.0314570385430625\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971723,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971723\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6339285714285714,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.6339285714285714,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808629,\n\
\ \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808629\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n\
\ \"acc_stderr\": 0.01700436856813237,\n \"acc_norm\": 0.9273504273504274,\n\
\ \"acc_norm_stderr\": 0.01700436856813237\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8914431673052363,\n\
\ \"acc_stderr\": 0.011124283175851183,\n \"acc_norm\": 0.8914431673052363,\n\
\ \"acc_norm_stderr\": 0.011124283175851183\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8439306358381503,\n \"acc_stderr\": 0.019539014685374036,\n\
\ \"acc_norm\": 0.8439306358381503,\n \"acc_norm_stderr\": 0.019539014685374036\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6636871508379888,\n\
\ \"acc_stderr\": 0.0158010037291459,\n \"acc_norm\": 0.6636871508379888,\n\
\ \"acc_norm_stderr\": 0.0158010037291459\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8300653594771242,\n \"acc_stderr\": 0.02150538312123138,\n\
\ \"acc_norm\": 0.8300653594771242,\n \"acc_norm_stderr\": 0.02150538312123138\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.819935691318328,\n\
\ \"acc_stderr\": 0.02182342285774494,\n \"acc_norm\": 0.819935691318328,\n\
\ \"acc_norm_stderr\": 0.02182342285774494\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8487654320987654,\n \"acc_stderr\": 0.019935086092149886,\n\
\ \"acc_norm\": 0.8487654320987654,\n \"acc_norm_stderr\": 0.019935086092149886\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6134751773049646,\n \"acc_stderr\": 0.02904919034254347,\n \
\ \"acc_norm\": 0.6134751773049646,\n \"acc_norm_stderr\": 0.02904919034254347\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.589960886571056,\n\
\ \"acc_stderr\": 0.012561837621962032,\n \"acc_norm\": 0.589960886571056,\n\
\ \"acc_norm_stderr\": 0.012561837621962032\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.023157468308559345,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.023157468308559345\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8218954248366013,\n \"acc_stderr\": 0.015478369653108568,\n \
\ \"acc_norm\": 0.8218954248366013,\n \"acc_norm_stderr\": 0.015478369653108568\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8244897959183674,\n \"acc_stderr\": 0.02435280072297001,\n\
\ \"acc_norm\": 0.8244897959183674,\n \"acc_norm_stderr\": 0.02435280072297001\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9154228855721394,\n\
\ \"acc_stderr\": 0.019675343217199173,\n \"acc_norm\": 0.9154228855721394,\n\
\ \"acc_norm_stderr\": 0.019675343217199173\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759057,\n \
\ \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759057\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.5783132530120482,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.565483476132191,\n\
\ \"mc1_stderr\": 0.017352738749259564,\n \"mc2\": 0.7174856574663107,\n\
\ \"mc2_stderr\": 0.014605715133518151\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8634569850039463,\n \"acc_stderr\": 0.0096502429002916\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6808188021228203,\n \
\ \"acc_stderr\": 0.012840345676251653\n }\n}\n```"
repo_url: https://huggingface.co/JaeyeonKang/CCK_Asura_v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|arc:challenge|25_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|gsm8k|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hellaswag|10_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T04-58-51.033818.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-12T04-58-51.033818.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- '**/details_harness|winogrande|5_2024-02-12T04-58-51.033818.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-12T04-58-51.033818.parquet'
- config_name: results
data_files:
- split: 2024_02_12T04_58_51.033818
path:
- results_2024-02-12T04-58-51.033818.parquet
- split: latest
path:
- results_2024-02-12T04-58-51.033818.parquet
---
# Dataset Card for Evaluation run of JaeyeonKang/CCK_Asura_v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [JaeyeonKang/CCK_Asura_v1](https://huggingface.co/JaeyeonKang/CCK_Asura_v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v1",
"harness_winogrande_5",
	split="latest")
```
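Because the timestamped split names use a zero-padded `year_month_dayThour_minute_second` layout, plain string comparison recovers the most recent run. A minimal sketch, using hypothetical split names rather than ones queried from the Hub:

```python
# Hypothetical timestamped split names, following the pattern used in this card.
splits = [
    "2024_01_30T12_00_00.000000",
    "2024_02_12T04_58_51.033818",
]

# The format is zero-padded, so lexicographic order matches chronological order
# and max() picks the most recent run.
latest = max(splits)
print(latest)  # → 2024_02_12T04_58_51.033818
```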
## Latest results
These are the [latest results from run 2024-02-12T04:58:51.033818](https://huggingface.co/datasets/open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v1/blob/main/results_2024-02-12T04-58-51.033818.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7535469467828841,
"acc_stderr": 0.028473742983492905,
"acc_norm": 0.7564527472308834,
"acc_norm_stderr": 0.029025433712812198,
"mc1": 0.565483476132191,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.7174856574663107,
"mc2_stderr": 0.014605715133518151
},
"harness|arc:challenge|25": {
"acc": 0.7013651877133106,
"acc_stderr": 0.013374078615068749,
"acc_norm": 0.7389078498293515,
"acc_norm_stderr": 0.012835523909473848
},
"harness|hellaswag|10": {
"acc": 0.719577773351922,
"acc_stderr": 0.004482874732237349,
"acc_norm": 0.8906592312288388,
"acc_norm_stderr": 0.003114285077228029
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7111111111111111,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.7111111111111111,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8552631578947368,
"acc_stderr": 0.0286319518459304,
"acc_norm": 0.8552631578947368,
"acc_norm_stderr": 0.0286319518459304
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372267,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.026280550932848087,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.026280550932848087
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4803921568627451,
"acc_stderr": 0.04971358884367406,
"acc_norm": 0.4803921568627451,
"acc_norm_stderr": 0.04971358884367406
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7404255319148936,
"acc_stderr": 0.02865917937429232,
"acc_norm": 0.7404255319148936,
"acc_norm_stderr": 0.02865917937429232
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7241379310344828,
"acc_stderr": 0.03724563619774632,
"acc_norm": 0.7241379310344828,
"acc_norm_stderr": 0.03724563619774632
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5343915343915344,
"acc_stderr": 0.02569032176249385,
"acc_norm": 0.5343915343915344,
"acc_norm_stderr": 0.02569032176249385
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.864516129032258,
"acc_stderr": 0.019469334586486933,
"acc_norm": 0.864516129032258,
"acc_norm_stderr": 0.019469334586486933
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.034139638059062345,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.034139638059062345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9040404040404041,
"acc_stderr": 0.020984808610047933,
"acc_norm": 0.9040404040404041,
"acc_norm_stderr": 0.020984808610047933
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607558,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607558
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7897435897435897,
"acc_stderr": 0.020660597485026945,
"acc_norm": 0.7897435897435897,
"acc_norm_stderr": 0.020660597485026945
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.030149135601365944,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.030149135601365944
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.865546218487395,
"acc_stderr": 0.022159373072744442,
"acc_norm": 0.865546218487395,
"acc_norm_stderr": 0.022159373072744442
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9192660550458716,
"acc_stderr": 0.011680172292862086,
"acc_norm": 0.9192660550458716,
"acc_norm_stderr": 0.011680172292862086
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6990740740740741,
"acc_stderr": 0.031280390843298804,
"acc_norm": 0.6990740740740741,
"acc_norm_stderr": 0.031280390843298804
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073315,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073315
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9240506329113924,
"acc_stderr": 0.0172446332510657,
"acc_norm": 0.9240506329113924,
"acc_norm_stderr": 0.0172446332510657
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.02871877688934232,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.02871877688934232
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.026243194054073878,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.026243194054073878
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.0314570385430625,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.0314570385430625
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971723,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971723
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6339285714285714,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.6339285714285714,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808629,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808629
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.01700436856813237,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.01700436856813237
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8914431673052363,
"acc_stderr": 0.011124283175851183,
"acc_norm": 0.8914431673052363,
"acc_norm_stderr": 0.011124283175851183
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8439306358381503,
"acc_stderr": 0.019539014685374036,
"acc_norm": 0.8439306358381503,
"acc_norm_stderr": 0.019539014685374036
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6636871508379888,
"acc_stderr": 0.0158010037291459,
"acc_norm": 0.6636871508379888,
"acc_norm_stderr": 0.0158010037291459
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8300653594771242,
"acc_stderr": 0.02150538312123138,
"acc_norm": 0.8300653594771242,
"acc_norm_stderr": 0.02150538312123138
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.819935691318328,
"acc_stderr": 0.02182342285774494,
"acc_norm": 0.819935691318328,
"acc_norm_stderr": 0.02182342285774494
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8487654320987654,
"acc_stderr": 0.019935086092149886,
"acc_norm": 0.8487654320987654,
"acc_norm_stderr": 0.019935086092149886
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6134751773049646,
"acc_stderr": 0.02904919034254347,
"acc_norm": 0.6134751773049646,
"acc_norm_stderr": 0.02904919034254347
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.589960886571056,
"acc_stderr": 0.012561837621962032,
"acc_norm": 0.589960886571056,
"acc_norm_stderr": 0.012561837621962032
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.023157468308559345,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.023157468308559345
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8218954248366013,
"acc_stderr": 0.015478369653108568,
"acc_norm": 0.8218954248366013,
"acc_norm_stderr": 0.015478369653108568
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8244897959183674,
"acc_stderr": 0.02435280072297001,
"acc_norm": 0.8244897959183674,
"acc_norm_stderr": 0.02435280072297001
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9154228855721394,
"acc_stderr": 0.019675343217199173,
"acc_norm": 0.9154228855721394,
"acc_norm_stderr": 0.019675343217199173
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.021904291355759057,
"acc_norm": 0.95,
"acc_norm_stderr": 0.021904291355759057
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.565483476132191,
"mc1_stderr": 0.017352738749259564,
"mc2": 0.7174856574663107,
"mc2_stderr": 0.014605715133518151
},
"harness|winogrande|5": {
"acc": 0.8634569850039463,
"acc_stderr": 0.0096502429002916
},
"harness|gsm8k|5": {
"acc": 0.6808188021228203,
"acc_stderr": 0.012840345676251653
}
}
```
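The top-level "all" block aggregates the per-task metrics. A minimal sketch of that aggregation, assuming an unweighted mean over tasks and using a hypothetical three-task subset of the accuracies above (so the result differs from the full 0.7535 figure):

```python
# Hypothetical subset of the per-task accuracies reported above.
task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.38,
    "harness|hendrycksTest-anatomy|5": 0.7111111111111111,
    "harness|hendrycksTest-astronomy|5": 0.8552631578947368,
}

# Sketch of the aggregation: an unweighted mean across tasks.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(round(mean_acc, 4))  # → 0.6488 (mean over this subset only)
```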
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Jing24/low-train-all | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
splits:
- name: train
num_bytes: 79730052
num_examples: 87589
download_size: 0
dataset_size: 79730052
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "low-train-all"
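The features above follow the SQuAD extractive-QA schema: each `answers` struct holds parallel `answer_start` offsets and `text` spans into `context`. A minimal sketch of one record and the offset invariant it should satisfy — the record itself is illustrative, not drawn from this dataset:

```python
# Illustrative SQuAD-style record matching the feature schema above.
example = {
    "id": "hypothetical-0001",
    "title": "Example",
    "context": "The quick brown fox jumps over the lazy dog.",
    "question": "What does the fox jump over?",
    "answers": {"answer_start": [31], "text": ["the lazy dog"]},
}

# Each answer text should be recoverable from `context` via its offset.
for start, text in zip(example["answers"]["answer_start"],
                       example["answers"]["text"]):
    assert example["context"][start:start + len(text)] == text
```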
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HumanCompatibleAI/random-seals-Swimmer-v1 | ---
dataset_info:
features:
- name: obs
sequence:
sequence: float64
- name: acts
sequence:
sequence: float32
- name: infos
sequence: string
- name: terminal
dtype: bool
- name: rews
sequence: float32
splits:
- name: train
num_bytes: 138046530
num_examples: 100
download_size: 36347782
dataset_size: 138046530
---
# Dataset Card for "random-seals-Swimmer-v1"
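Each row stores one rollout as parallel `obs`/`acts`/`rews` sequences plus a `terminal` flag — a shape that suggests the `imitation` library's trajectory format, though that is an assumption based only on the feature names. A toy sketch of computing an episode return from `rews`:

```python
# Illustrative trajectory -- real rows also carry obs, acts, and infos.
trajectory = {"rews": [0.5, -0.1, 0.2, 0.0], "terminal": True}

def episode_return(rews, gamma=1.0):
    """Discounted sum of rewards; gamma=1.0 gives the plain return."""
    return sum(r * gamma ** t for t, r in enumerate(rews))

undiscounted = episode_return(trajectory["rews"])
discounted = episode_return(trajectory["rews"], gamma=0.9)
```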
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pccl-org/formal-logic-simple-order-simple-objects-paired-blivergent-1500 | ---
dataset_info:
features:
- name: greater_than
dtype: string
- name: less_than
dtype: string
- name: paired_example
sequence:
sequence: string
- name: correct_example
sequence: string
- name: incorrect_example
sequence: string
- name: distance
dtype: int64
- name: index
dtype: int64
- name: index_in_distance
dtype: int64
splits:
- name: train
num_bytes: 324299227
num_examples: 1122753
download_size: 89647951
dataset_size: 324299227
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
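A back-of-the-envelope check from the split metadata above: average serialized bytes per example, and the compression ratio implied by `download_size` versus `dataset_size`. A rough sketch only — Arrow/Parquet framing overhead is ignored:

```python
# Numbers copied from the split metadata above.
num_bytes = 324_299_227
num_examples = 1_122_753
download_size = 89_647_951

avg_bytes_per_example = num_bytes / num_examples  # ~289 bytes
compression_ratio = num_bytes / download_size     # ~3.6x
```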
|
GalaktischeGurke/emails_500_lines | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3424289
num_examples: 500
download_size: 1831898
dataset_size: 3424289
---
# Dataset Card for "emails_500_lines"
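Each record carries a single `text` field. A minimal sketch of the expected shape, using an illustrative record rather than real data from this repo — the header/body split is one common way to handle RFC 822-style email text, not something the card specifies:

```python
# Illustrative record (not real data from this repo).
record = {"text": "From: sender@example.com\nSubject: Hello\n\nBody text."}
assert isinstance(record["text"], str)

# Split an RFC 822-style email into header block and body at the
# first blank line.
header, _, body = record["text"].partition("\n\n")
```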
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |