| datasetId | card |
|---|---|
zrc000ll/kid | ---
license: openrail
---
|
code_x_glue_ct_code_to_text | ---
annotations_creators:
- found
language_creators:
- found
language:
- code
- en
license:
- c-uda
multilinguality:
- other-programming-languages
size_categories:
- 100K<n<1M
- 10K<n<100K
source_datasets:
- original
task_categories:
- translation
task_ids: []
pretty_name: CodeXGlueCtCodeToText
config_names:
- go
- java
- javascript
- php
- python
- ruby
tags:
- code-to-text
dataset_info:
- config_name: go
features:
- name: id
dtype: int32
- name: repo
dtype: string
- name: path
dtype: string
- name: func_name
dtype: string
- name: original_string
dtype: string
- name: language
dtype: string
- name: code
dtype: string
- name: code_tokens
sequence: string
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: sha
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 342243143
num_examples: 167288
- name: validation
num_bytes: 13721860
num_examples: 7325
- name: test
num_bytes: 16328406
num_examples: 8122
download_size: 121341698
dataset_size: 372293409
- config_name: java
features:
- name: id
dtype: int32
- name: repo
dtype: string
- name: path
dtype: string
- name: func_name
dtype: string
- name: original_string
dtype: string
- name: language
dtype: string
- name: code
dtype: string
- name: code_tokens
sequence: string
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: sha
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 452553835
num_examples: 164923
- name: validation
num_bytes: 13366344
num_examples: 5183
- name: test
num_bytes: 29080753
num_examples: 10955
download_size: 154701399
dataset_size: 495000932
- config_name: javascript
features:
- name: id
dtype: int32
- name: repo
dtype: string
- name: path
dtype: string
- name: func_name
dtype: string
- name: original_string
dtype: string
- name: language
dtype: string
- name: code
dtype: string
- name: code_tokens
sequence: string
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: sha
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 160860431
num_examples: 58025
- name: validation
num_bytes: 10337344
num_examples: 3885
- name: test
num_bytes: 10190713
num_examples: 3291
download_size: 65788314
dataset_size: 181388488
- config_name: php
features:
- name: id
dtype: int32
- name: repo
dtype: string
- name: path
dtype: string
- name: func_name
dtype: string
- name: original_string
dtype: string
- name: language
dtype: string
- name: code
dtype: string
- name: code_tokens
sequence: string
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: sha
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 614654499
num_examples: 241241
- name: validation
num_bytes: 33283045
num_examples: 12982
- name: test
num_bytes: 35374993
num_examples: 14014
download_size: 219692158
dataset_size: 683312537
- config_name: python
features:
- name: id
dtype: int32
- name: repo
dtype: string
- name: path
dtype: string
- name: func_name
dtype: string
- name: original_string
dtype: string
- name: language
dtype: string
- name: code
dtype: string
- name: code_tokens
sequence: string
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: sha
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 813663148
num_examples: 251820
- name: validation
num_bytes: 46888564
num_examples: 13914
- name: test
num_bytes: 50659688
num_examples: 14918
download_size: 325551862
dataset_size: 911211400
- config_name: ruby
features:
- name: id
dtype: int32
- name: repo
dtype: string
- name: path
dtype: string
- name: func_name
dtype: string
- name: original_string
dtype: string
- name: language
dtype: string
- name: code
dtype: string
- name: code_tokens
sequence: string
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: sha
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 51956439
num_examples: 24927
- name: validation
num_bytes: 2821037
num_examples: 1400
- name: test
num_bytes: 2671551
num_examples: 1261
download_size: 21921316
dataset_size: 57449027
configs:
- config_name: go
data_files:
- split: train
path: go/train-*
- split: validation
path: go/validation-*
- split: test
path: go/test-*
- config_name: java
data_files:
- split: train
path: java/train-*
- split: validation
path: java/validation-*
- split: test
path: java/test-*
- config_name: javascript
data_files:
- split: train
path: javascript/train-*
- split: validation
path: javascript/validation-*
- split: test
path: javascript/test-*
- config_name: php
data_files:
- split: train
path: php/train-*
- split: validation
path: php/validation-*
- split: test
path: php/test-*
- config_name: python
data_files:
- split: train
path: python/train-*
- split: validation
path: python/validation-*
- split: test
path: python/test-*
- config_name: ruby
data_files:
- split: train
path: ruby/train-*
- split: validation
path: ruby/validation-*
- split: test
path: ruby/test-*
---
# Dataset Card for "code_x_glue_ct_code_to_text"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
  - [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/microsoft/CodeXGLUE/tree/main/Code-Text/code-to-text
### Dataset Summary
CodeXGLUE code-to-text dataset, available at https://github.com/microsoft/CodeXGLUE/tree/main/Code-Text/code-to-text
The dataset we use comes from CodeSearchNet, and we filter it as follows:
- Remove examples whose code cannot be parsed into an abstract syntax tree.
- Remove examples whose documentation has fewer than 3 or more than 256 tokens.
- Remove examples whose documentation contains special tokens (e.g. `<img ...>` or `https:...`).
- Remove examples whose documentation is not written in English.
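The token-count and special-token rules above can be sketched as a simple predicate. This is an illustrative reimplementation, not the official CodeXGLUE preprocessing code; the AST-parsing and language-detection rules are omitted, and the special-token regex is an assumption.

```python
import re


def keep_example(docstring_tokens):
    """Illustrative filter mirroring the documentation rules above
    (not the official CodeXGLUE preprocessing code)."""
    # Rule: documentation must have between 3 and 256 tokens
    # (inclusive bounds assumed here).
    if len(docstring_tokens) < 3 or len(docstring_tokens) > 256:
        return False
    doc = " ".join(docstring_tokens)
    # Rule: drop documents containing special tokens such as <img ...> or URLs.
    if re.search(r"<img|https?:", doc):
        return False
    return True


print(keep_example(["Save", "model", "to", "a", "pickle"]))  # True
print(keep_example(["Too", "short"]))                        # False
```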
### Supported Tasks and Leaderboards
- `machine-translation`: The dataset can be used to train a model for automatically generating **English** docstrings for code.
### Languages
- Go **programming** language
- Java **programming** language
- Javascript **programming** language
- PHP **programming** language
- Python **programming** language
- Ruby **programming** language
- English **natural** language
## Dataset Structure
### Data Instances
#### go
An example from the 'test' split looks as follows.
```
{
"code": "func NewSTM(c *v3.Client, apply func(STM) error, so ...stmOption) (*v3.TxnResponse, error) {\n\topts := &stmOptions{ctx: c.Ctx()}\n\tfor _, f := range so {\n\t\tf(opts)\n\t}\n\tif len(opts.prefetch) != 0 {\n\t\tf := apply\n\t\tapply = func(s STM) error {\n\t\t\ts.Get(opts.prefetch...)\n\t\t\treturn f(s)\n\t\t}\n\t}\n\treturn runSTM(mkSTM(c, opts), apply)\n}",
"code_tokens": ["func", "NewSTM", "(", "c", "*", "v3", ".", "Client", ",", "apply", "func", "(", "STM", ")", "error", ",", "so", "...", "stmOption", ")", "(", "*", "v3", ".", "TxnResponse", ",", "error", ")", "{", "opts", ":=", "&", "stmOptions", "{", "ctx", ":", "c", ".", "Ctx", "(", ")", "}", "\n", "for", "_", ",", "f", ":=", "range", "so", "{", "f", "(", "opts", ")", "\n", "}", "\n", "if", "len", "(", "opts", ".", "prefetch", ")", "!=", "0", "{", "f", ":=", "apply", "\n", "apply", "=", "func", "(", "s", "STM", ")", "error", "{", "s", ".", "Get", "(", "opts", ".", "prefetch", "...", ")", "\n", "return", "f", "(", "s", ")", "\n", "}", "\n", "}", "\n", "return", "runSTM", "(", "mkSTM", "(", "c", ",", "opts", ")", ",", "apply", ")", "\n", "}"],
"docstring": "// NewSTM initiates a new STM instance, using serializable snapshot isolation by default.",
"docstring_tokens": ["NewSTM", "initiates", "a", "new", "STM", "instance", "using", "serializable", "snapshot", "isolation", "by", "default", "."],
"func_name": "NewSTM",
"id": 0,
"language": "go",
"original_string": "func NewSTM(c *v3.Client, apply func(STM) error, so ...stmOption) (*v3.TxnResponse, error) {\n\topts := &stmOptions{ctx: c.Ctx()}\n\tfor _, f := range so {\n\t\tf(opts)\n\t}\n\tif len(opts.prefetch) != 0 {\n\t\tf := apply\n\t\tapply = func(s STM) error {\n\t\t\ts.Get(opts.prefetch...)\n\t\t\treturn f(s)\n\t\t}\n\t}\n\treturn runSTM(mkSTM(c, opts), apply)\n}",
"path": "clientv3/concurrency/stm.go",
"repo": "etcd-io/etcd",
"sha": "616592d9ba993e3fe9798eef581316016df98906",
"url": "https://github.com/etcd-io/etcd/blob/616592d9ba993e3fe9798eef581316016df98906/clientv3/concurrency/stm.go#L89-L102"
}
```
#### java
An example from the 'test' split looks as follows.
```
{
"code": "protected final void fastPathOrderedEmit(U value, boolean delayError, Disposable disposable) {\n final Observer<? super V> observer = downstream;\n final SimplePlainQueue<U> q = queue;\n\n if (wip.get() == 0 && wip.compareAndSet(0, 1)) {\n if (q.isEmpty()) {\n accept(observer, value);\n if (leave(-1) == 0) {\n return;\n }\n } else {\n q.offer(value);\n }\n } else {\n q.offer(value);\n if (!enter()) {\n return;\n }\n }\n QueueDrainHelper.drainLoop(q, observer, delayError, disposable, this);\n }",
"code_tokens": ["protected", "final", "void", "fastPathOrderedEmit", "(", "U", "value", ",", "boolean", "delayError", ",", "Disposable", "disposable", ")", "{", "final", "Observer", "<", "?", "super", "V", ">", "observer", "=", "downstream", ";", "final", "SimplePlainQueue", "<", "U", ">", "q", "=", "queue", ";", "if", "(", "wip", ".", "get", "(", ")", "==", "0", "&&", "wip", ".", "compareAndSet", "(", "0", ",", "1", ")", ")", "{", "if", "(", "q", ".", "isEmpty", "(", ")", ")", "{", "accept", "(", "observer", ",", "value", ")", ";", "if", "(", "leave", "(", "-", "1", ")", "==", "0", ")", "{", "return", ";", "}", "}", "else", "{", "q", ".", "offer", "(", "value", ")", ";", "}", "}", "else", "{", "q", ".", "offer", "(", "value", ")", ";", "if", "(", "!", "enter", "(", ")", ")", "{", "return", ";", "}", "}", "QueueDrainHelper", ".", "drainLoop", "(", "q", ",", "observer", ",", "delayError", ",", "disposable", ",", "this", ")", ";", "}"],
"docstring": "Makes sure the fast-path emits in order.\n@param value the value to emit or queue up\n@param delayError if true, errors are delayed until the source has terminated\n@param disposable the resource to dispose if the drain terminates",
"docstring_tokens": ["Makes", "sure", "the", "fast", "-", "path", "emits", "in", "order", "."],
"func_name": "QueueDrainObserver.fastPathOrderedEmit",
"id": 0,
"language": "java",
"original_string": "protected final void fastPathOrderedEmit(U value, boolean delayError, Disposable disposable) {\n final Observer<? super V> observer = downstream;\n final SimplePlainQueue<U> q = queue;\n\n if (wip.get() == 0 && wip.compareAndSet(0, 1)) {\n if (q.isEmpty()) {\n accept(observer, value);\n if (leave(-1) == 0) {\n return;\n }\n } else {\n q.offer(value);\n }\n } else {\n q.offer(value);\n if (!enter()) {\n return;\n }\n }\n QueueDrainHelper.drainLoop(q, observer, delayError, disposable, this);\n }",
"path": "src/main/java/io/reactivex/internal/observers/QueueDrainObserver.java",
"repo": "ReactiveX/RxJava",
"sha": "ac84182aa2bd866b53e01c8e3fe99683b882c60e",
"url": "https://github.com/ReactiveX/RxJava/blob/ac84182aa2bd866b53e01c8e3fe99683b882c60e/src/main/java/io/reactivex/internal/observers/QueueDrainObserver.java#L88-L108"
}
```
#### javascript
An example from the 'test' split looks as follows.
```
{
"code": "function createInstance(defaultConfig) {\n var context = new Axios(defaultConfig);\n var instance = bind(Axios.prototype.request, context);\n\n // Copy axios.prototype to instance\n utils.extend(instance, Axios.prototype, context);\n\n // Copy context to instance\n utils.extend(instance, context);\n\n return instance;\n}",
"code_tokens": ["function", "createInstance", "(", "defaultConfig", ")", "{", "var", "context", "=", "new", "Axios", "(", "defaultConfig", ")", ";", "var", "instance", "=", "bind", "(", "Axios", ".", "prototype", ".", "request", ",", "context", ")", ";", "// Copy axios.prototype to instance", "utils", ".", "extend", "(", "instance", ",", "Axios", ".", "prototype", ",", "context", ")", ";", "// Copy context to instance", "utils", ".", "extend", "(", "instance", ",", "context", ")", ";", "return", "instance", ";", "}"],
"docstring": "Create an instance of Axios\n\n@param {Object} defaultConfig The default config for the instance\n@return {Axios} A new instance of Axios",
"docstring_tokens": ["Create", "an", "instance", "of", "Axios"],
"func_name": "createInstance",
"id": 0,
"language": "javascript",
"original_string": "function createInstance(defaultConfig) {\n var context = new Axios(defaultConfig);\n var instance = bind(Axios.prototype.request, context);\n\n // Copy axios.prototype to instance\n utils.extend(instance, Axios.prototype, context);\n\n // Copy context to instance\n utils.extend(instance, context);\n\n return instance;\n}",
"path": "lib/axios.js",
"repo": "axios/axios",
"sha": "92d231387fe2092f8736bc1746d4caa766b675f5",
"url": "https://github.com/axios/axios/blob/92d231387fe2092f8736bc1746d4caa766b675f5/lib/axios.js#L15-L26"
}
```
#### php
An example from the 'train' split looks as follows.
```
{
"code": "public static function build($serviceAddress, $restConfigPath, array $config = [])\n {\n $config += [\n 'httpHandler' => null,\n ];\n list($baseUri, $port) = self::normalizeServiceAddress($serviceAddress);\n $requestBuilder = new RequestBuilder(\"$baseUri:$port\", $restConfigPath);\n $httpHandler = $config['httpHandler'] ?: self::buildHttpHandlerAsync();\n return new RestTransport($requestBuilder, $httpHandler);\n }",
"code_tokens": ["public", "static", "function", "build", "(", "$", "serviceAddress", ",", "$", "restConfigPath", ",", "array", "$", "config", "=", "[", "]", ")", "{", "$", "config", "+=", "[", "'httpHandler'", "=>", "null", ",", "]", ";", "list", "(", "$", "baseUri", ",", "$", "port", ")", "=", "self", "::", "normalizeServiceAddress", "(", "$", "serviceAddress", ")", ";", "$", "requestBuilder", "=", "new", "RequestBuilder", "(", "\"$baseUri:$port\"", ",", "$", "restConfigPath", ")", ";", "$", "httpHandler", "=", "$", "config", "[", "'httpHandler'", "]", "?", ":", "self", "::", "buildHttpHandlerAsync", "(", ")", ";", "return", "new", "RestTransport", "(", "$", "requestBuilder", ",", "$", "httpHandler", ")", ";", "}"],
"docstring": "Builds a RestTransport.\n\n@param string $serviceAddress\nThe address of the API remote host, for example \"example.googleapis.com\".\n@param string $restConfigPath\nPath to rest config file.\n@param array $config {\nConfig options used to construct the gRPC transport.\n\n@type callable $httpHandler A handler used to deliver PSR-7 requests.\n}\n@return RestTransport\n@throws ValidationException",
"docstring_tokens": ["Builds", "a", "RestTransport", "."],
"func_name": "RestTransport.build",
"id": 0,
"language": "php",
"original_string": "public static function build($serviceAddress, $restConfigPath, array $config = [])\n {\n $config += [\n 'httpHandler' => null,\n ];\n list($baseUri, $port) = self::normalizeServiceAddress($serviceAddress);\n $requestBuilder = new RequestBuilder(\"$baseUri:$port\", $restConfigPath);\n $httpHandler = $config['httpHandler'] ?: self::buildHttpHandlerAsync();\n return new RestTransport($requestBuilder, $httpHandler);\n }",
"path": "src/Transport/RestTransport.php",
"repo": "googleapis/gax-php",
"sha": "48387fb818c6882296710a2302a0aa973b99afb2",
"url": "https://github.com/googleapis/gax-php/blob/48387fb818c6882296710a2302a0aa973b99afb2/src/Transport/RestTransport.php#L85-L94"
}
```
#### python
An example from the 'validation' split looks as follows.
```
{
"code": "def save_act(self, path=None):\n \"\"\"Save model to a pickle located at `path`\"\"\"\n if path is None:\n path = os.path.join(logger.get_dir(), \"model.pkl\")\n\n with tempfile.TemporaryDirectory() as td:\n save_variables(os.path.join(td, \"model\"))\n arc_name = os.path.join(td, \"packed.zip\")\n with zipfile.ZipFile(arc_name, 'w') as zipf:\n for root, dirs, files in os.walk(td):\n for fname in files:\n file_path = os.path.join(root, fname)\n if file_path != arc_name:\n zipf.write(file_path, os.path.relpath(file_path, td))\n with open(arc_name, \"rb\") as f:\n model_data = f.read()\n with open(path, \"wb\") as f:\n cloudpickle.dump((model_data, self._act_params), f)",
"code_tokens": ["def", "save_act", "(", "self", ",", "path", "=", "None", ")", ":", "if", "path", "is", "None", ":", "path", "=", "os", ".", "path", ".", "join", "(", "logger", ".", "get_dir", "(", ")", ",", "\"model.pkl\"", ")", "with", "tempfile", ".", "TemporaryDirectory", "(", ")", "as", "td", ":", "save_variables", "(", "os", ".", "path", ".", "join", "(", "td", ",", "\"model\"", ")", ")", "arc_name", "=", "os", ".", "path", ".", "join", "(", "td", ",", "\"packed.zip\"", ")", "with", "zipfile", ".", "ZipFile", "(", "arc_name", ",", "'w'", ")", "as", "zipf", ":", "for", "root", ",", "dirs", ",", "files", "in", "os", ".", "walk", "(", "td", ")", ":", "for", "fname", "in", "files", ":", "file_path", "=", "os", ".", "path", ".", "join", "(", "root", ",", "fname", ")", "if", "file_path", "!=", "arc_name", ":", "zipf", ".", "write", "(", "file_path", ",", "os", ".", "path", ".", "relpath", "(", "file_path", ",", "td", ")", ")", "with", "open", "(", "arc_name", ",", "\"rb\"", ")", "as", "f", ":", "model_data", "=", "f", ".", "read", "(", ")", "with", "open", "(", "path", ",", "\"wb\"", ")", "as", "f", ":", "cloudpickle", ".", "dump", "(", "(", "model_data", ",", "self", ".", "_act_params", ")", ",", "f", ")"],
"docstring": "Save model to a pickle located at `path`",
"docstring_tokens": ["Save", "model", "to", "a", "pickle", "located", "at", "path"],
"func_name": "ActWrapper.save_act",
"id": 0,
"language": "python",
"original_string": "def save_act(self, path=None):\n \"\"\"Save model to a pickle located at `path`\"\"\"\n if path is None:\n path = os.path.join(logger.get_dir(), \"model.pkl\")\n\n with tempfile.TemporaryDirectory() as td:\n save_variables(os.path.join(td, \"model\"))\n arc_name = os.path.join(td, \"packed.zip\")\n with zipfile.ZipFile(arc_name, 'w') as zipf:\n for root, dirs, files in os.walk(td):\n for fname in files:\n file_path = os.path.join(root, fname)\n if file_path != arc_name:\n zipf.write(file_path, os.path.relpath(file_path, td))\n with open(arc_name, \"rb\") as f:\n model_data = f.read()\n with open(path, \"wb\") as f:\n cloudpickle.dump((model_data, self._act_params), f)",
"path": "baselines/deepq/deepq.py",
"repo": "openai/baselines",
"sha": "3301089b48c42b87b396e246ea3f56fa4bfc9678",
"url": "https://github.com/openai/baselines/blob/3301089b48c42b87b396e246ea3f56fa4bfc9678/baselines/deepq/deepq.py#L55-L72"
}
```
#### ruby
An example from the 'train' split looks as follows.
```
{
"code": "def render_body(context, options)\n if options.key?(:partial)\n [render_partial(context, options)]\n else\n StreamingTemplateRenderer.new(@lookup_context).render(context, options)\n end\n end",
"code_tokens": ["def", "render_body", "(", "context", ",", "options", ")", "if", "options", ".", "key?", "(", ":partial", ")", "[", "render_partial", "(", "context", ",", "options", ")", "]", "else", "StreamingTemplateRenderer", ".", "new", "(", "@lookup_context", ")", ".", "render", "(", "context", ",", "options", ")", "end", "end"],
"docstring": "Render but returns a valid Rack body. If fibers are defined, we return\n a streaming body that renders the template piece by piece.\n\n Note that partials are not supported to be rendered with streaming,\n so in such cases, we just wrap them in an array.",
"docstring_tokens": ["Render", "but", "returns", "a", "valid", "Rack", "body", ".", "If", "fibers", "are", "defined", "we", "return", "a", "streaming", "body", "that", "renders", "the", "template", "piece", "by", "piece", "."],
"func_name": "ActionView.Renderer.render_body",
"id": 0,
"language": "ruby",
"original_string": "def render_body(context, options)\n if options.key?(:partial)\n [render_partial(context, options)]\n else\n StreamingTemplateRenderer.new(@lookup_context).render(context, options)\n end\n end",
"path": "actionview/lib/action_view/renderer/renderer.rb",
"repo": "rails/rails",
"sha": "85a8bc644be69908f05740a5886ec19cd3679df5",
"url": "https://github.com/rails/rails/blob/85a8bc644be69908f05740a5886ec19cd3679df5/actionview/lib/action_view/renderer/renderer.rb#L38-L44"
}
```
### Data Fields
Each data field is described below. The data fields are the same across all configurations and splits.
#### go, java, javascript, php, python, ruby
| field name | type | description |
|----------------|----------------|-----------------------------------------------------------------------|
|id |int32 | index of the sample |
|repo |string | the owner/repo of the source repository |
|path |string | the full path to the original file |
|func_name |string | the function or method name |
|original_string |string | the raw string before tokenization or parsing |
|code |string | the part of `original_string` that is code |
|language |string | the programming language name |
|code_tokens |Sequence[string]| tokenized version of `code` |
|docstring |string | the top-level comment or docstring, if it exists in the original string |
|docstring_tokens|Sequence[string]| tokenized version of `docstring` |
|sha |string | sha of the file |
|url |string | url of the file |
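A minimal sketch of working with these fields, using a record trimmed down from the ruby 'train' example shown earlier in this card (only a subset of fields is reproduced):

```python
# A trimmed record modeled on the ruby 'train' example above.
record = {
    "id": 0,
    "repo": "rails/rails",
    "language": "ruby",
    "func_name": "ActionView.Renderer.render_body",
    "docstring": "Render but returns a valid Rack body.",
    "docstring_tokens": ["Render", "but", "returns", "a", "valid", "Rack", "body", "."],
}

# Scalar fields are plain values; the *_tokens fields are lists of strings.
assert isinstance(record["docstring"], str)
assert all(isinstance(t, str) for t in record["docstring_tokens"])

# Joining docstring_tokens gives a rough (whitespace-separated) detokenization.
approx = " ".join(record["docstring_tokens"])
print(approx)  # Render but returns a valid Rack body .
```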
### Data Splits
| name |train |validation|test |
|----------|-----:|---------:|----:|
|go |167288| 7325| 8122|
|java |164923| 5183|10955|
|javascript| 58025| 3885| 3291|
|php |241241| 12982|14014|
|python |251820| 13914|14918|
|ruby | 24927| 1400| 1261|
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
The data comes from the CodeSearchNet Challenge dataset.
[More Information Needed]
#### Who are the source language producers?
Software developers.
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
https://github.com/microsoft, https://github.com/madlag
### Licensing Information
Computational Use of Data Agreement (C-UDA) License.
### Citation Information
```
@article{husain2019codesearchnet,
title={Codesearchnet challenge: Evaluating the state of semantic code search},
author={Husain, Hamel and Wu, Ho-Hsiang and Gazit, Tiferet and Allamanis, Miltiadis and Brockschmidt, Marc},
journal={arXiv preprint arXiv:1909.09436},
year={2019}
}
```
### Contributions
Thanks to @madlag (and partly also @ncoop57) for adding this dataset. |
ilhamxx/Receipt_data | ---
license: unknown
---
|
Telugu-LLM-Labs/telugu_teknium_GPTeacher_general_instruct_filtered_romanized | ---
license: mit
---
|
Amirjalaly/books_fegh | ---
dataset_info:
features:
- name: url
dtype: string
- name: language
dtype: string
- name: original_nlines
dtype: string
- name: part
dtype: string
- name: page
dtype: string
- name: nlines
dtype: string
- name: length
dtype: string
- name: title
dtype: string
- name: raw_content
dtype: string
- name: date_download
dtype: string
- name: language_score
dtype: string
- name: type
dtype: string
- name: perplexity
dtype: string
- name: original_length
dtype: string
- name: source_domain
dtype: string
splits:
- name: book_part1_fegh
num_bytes: 49004718
num_examples: 20000
- name: book_part2_fegh
num_bytes: 62933514
num_examples: 20000
- name: book_part3_fegh
num_bytes: 58078049
num_examples: 20000
- name: book_part4_fegh
num_bytes: 58591383
num_examples: 20000
- name: book_part5_fegh
num_bytes: 42504116
num_examples: 20000
- name: book_part6_fegh
num_bytes: 50998384
num_examples: 20000
- name: book_part7_fegh
num_bytes: 52735009
num_examples: 20000
- name: book_part8_fegh
num_bytes: 54972205
num_examples: 20000
- name: book_part9_fegh
num_bytes: 65020286
num_examples: 20000
- name: book_part10_fegh
num_bytes: 54380664
num_examples: 20000
- name: book_part11_fegh
num_bytes: 47427339
num_examples: 20000
- name: book_part12_fegh
num_bytes: 48398860
num_examples: 20000
- name: book_part13_fegh
num_bytes: 45573841
num_examples: 20000
- name: book_part14_fegh
num_bytes: 48445623
num_examples: 20000
- name: book_part15_fegh
num_bytes: 50559997
num_examples: 20000
- name: book_part16_fegh
num_bytes: 51662992
num_examples: 20000
- name: book_part17_fegh
num_bytes: 50755938
num_examples: 20000
- name: book_part18_fegh
num_bytes: 57893738
num_examples: 20000
- name: book_part19_fegh
num_bytes: 57818764
num_examples: 20000
- name: book_part20_fegh
num_bytes: 65119365
num_examples: 20000
- name: book_part21_fegh
num_bytes: 173500719
num_examples: 20000
- name: book_part22_fegh
num_bytes: 53707115
num_examples: 20000
- name: book_part23_fegh
num_bytes: 50702659
num_examples: 20000
- name: book_part24_fegh
num_bytes: 55158664
num_examples: 20000
- name: book_part25_fegh
num_bytes: 50015458
num_examples: 20000
- name: book_part26_fegh
num_bytes: 38386325
num_examples: 13982
download_size: 647726723
dataset_size: 1494345725
configs:
- config_name: default
data_files:
- split: book_part1_fegh
path: data/book_part1_fegh-*
- split: book_part2_fegh
path: data/book_part2_fegh-*
- split: book_part3_fegh
path: data/book_part3_fegh-*
- split: book_part4_fegh
path: data/book_part4_fegh-*
- split: book_part5_fegh
path: data/book_part5_fegh-*
- split: book_part6_fegh
path: data/book_part6_fegh-*
- split: book_part7_fegh
path: data/book_part7_fegh-*
- split: book_part8_fegh
path: data/book_part8_fegh-*
- split: book_part9_fegh
path: data/book_part9_fegh-*
- split: book_part10_fegh
path: data/book_part10_fegh-*
- split: book_part11_fegh
path: data/book_part11_fegh-*
- split: book_part12_fegh
path: data/book_part12_fegh-*
- split: book_part13_fegh
path: data/book_part13_fegh-*
- split: book_part14_fegh
path: data/book_part14_fegh-*
- split: book_part15_fegh
path: data/book_part15_fegh-*
- split: book_part16_fegh
path: data/book_part16_fegh-*
- split: book_part17_fegh
path: data/book_part17_fegh-*
- split: book_part18_fegh
path: data/book_part18_fegh-*
- split: book_part19_fegh
path: data/book_part19_fegh-*
- split: book_part20_fegh
path: data/book_part20_fegh-*
- split: book_part21_fegh
path: data/book_part21_fegh-*
- split: book_part22_fegh
path: data/book_part22_fegh-*
- split: book_part23_fegh
path: data/book_part23_fegh-*
- split: book_part24_fegh
path: data/book_part24_fegh-*
- split: book_part25_fegh
path: data/book_part25_fegh-*
- split: book_part26_fegh
path: data/book_part26_fegh-*
---
|
Chu0113/hekk | ---
dataset_info:
- config_name: all
features:
- name: data_index_by_user
dtype: int32
- name: article
dtype: string
- name: answer
dtype: string
- name: question
dtype: string
- name: options
sequence: string
splits:
- name: train
num_bytes: 191129599
num_examples: 87866
- name: validation
num_bytes: 10507580
num_examples: 4887
- name: test
num_bytes: 10668488
num_examples: 4934
download_size: 46954865
dataset_size: 212305667
- config_name: middle
features:
- name: data_index_by_user
dtype: int32
- name: article
dtype: string
- name: answer
dtype: string
- name: question
dtype: string
- name: options
sequence: string
splits:
- name: train
num_bytes: 191129599
num_examples: 87866
- name: validation
num_bytes: 10507580
num_examples: 4887
- name: test
num_bytes: 10668488
num_examples: 4934
download_size: 46954865
dataset_size: 212305667
configs:
- config_name: all
data_files:
- split: train
path: all/train-*
- split: validation
path: all/validation-*
- split: test
path: all/test-*
- config_name: middle
data_files:
- split: train
path: middle/train-*
- split: validation
path: middle/validation-*
- split: test
path: middle/test-*
---
|
KBLab/sucx3_ner | ---
annotations_creators:
- expert-generated
language_creators:
- other
language:
- sv
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- unknown
source_datasets:
- original
task_categories:
- other
task_ids:
- named-entity-recognition
- part-of-speech
pretty_name: sucx3_ner
tags:
- structure-prediction
---
# Dataset Card for _SUCX 3.0 - NER_
## Dataset Description
- **Homepage:** [https://spraakbanken.gu.se/en/resources/suc3](https://spraakbanken.gu.se/en/resources/suc3)
- **Repository:** [https://github.com/kb-labb/sucx3_ner](https://github.com/kb-labb/sucx3_ner)
- **Paper:** [SUC 2.0 manual](http://spraakbanken.gu.se/parole/Docs/SUC2.0-manual.pdf)
- **Point of Contact:**
### Dataset Summary
The dataset is a conversion of the venerable SUC 3.0 dataset into the
Hugging Face ecosystem.
The original dataset does not contain an official train-dev-test split, so one
is introduced here; the NER tag distribution is mostly the same across the
three splits.
The dataset has three different tagsets: manually annotated POS,
manually annotated NER, and automatically annotated NER.
For the automatically annotated NER tags, only sentences where the automatic
and manual annotations match (with respect to their categories) were kept.
Additionally, we provide remixes of the same data with some or all sentences
lowercased.
### Supported Tasks and Leaderboards
- Part-of-Speech tagging
- Named-Entity-Recognition
### Languages
Swedish
## Dataset Structure
### Data Remixes
- `original_tags` contain the manual NER annotations
- `lower` the whole dataset uncased
- `lower_mix` some of the dataset uncased
- `lower_both` every instance both cased and uncased
- `simple_tags` contain the automatic NER annotations
- `lower` the whole dataset uncased
- `lower_mix` some of the dataset uncased
- `lower_both` every instance both cased and uncased
### Data Instances
For each instance, there is an `id`, with an optional `_lower` suffix to mark
that it has been modified, a `tokens` list of strings containing tokens, a
`pos_tags` list of strings containing POS-tags, and a `ner_tags` list of strings
containing NER-tags.
```json
{"id": "e24d782c-e2475603_lower",
"tokens": ["-", "dels", "har", "vi", "inget", "index", "att", "g\u00e5", "efter", ",", "vi", "kr\u00e4ver", "allts\u00e5", "ers\u00e4ttning", "i", "40-talets", "penningv\u00e4rde", "."],
"pos_tags": ["MID", "KN", "VB", "PN", "DT", "NN", "IE", "VB", "PP", "MID", "PN", "VB", "AB", "NN", "PP", "NN", "NN", "MAD"],
"ner_tags": ["O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O"]}
```
### Data Fields
- `id`: a string containing the sentence-id
- `tokens`: a list of strings containing the sentence's tokens
- `pos_tags`: a list of strings containing the tokens' POS annotations
- `ner_tags`: a list of strings containing the tokens' NER annotations
### Data Splits
| Dataset Split | Size Percentage of Total Dataset Size | Number of Instances for the Original Tags |
| ------------- | ------------------------------------- | ----------------------------------------- |
| train         | 64%                                   | 46,026                                     |
| dev           | 16%                                   | 11,506                                     |
| test          | 20%                                   | 14,383                                     |
The `simple_tags` remix has fewer instances due to the requirement to match
tags.
## Dataset Creation
See the [original webpage](https://spraakbanken.gu.se/en/resources/suc3)
## Additional Information
### Dataset Curators
[Språkbanken](sb-info@svenska.gu.se)
### Licensing Information
CC BY 4.0 (attribution)
### Citation Information
[SUC 2.0 manual](http://spraakbanken.gu.se/parole/Docs/SUC2.0-manual.pdf)
### Contributions
Thanks to [@robinqrtz](https://github.com/robinqrtz) for adding this dataset.
|
zeio/pale | ---
language:
- en
license: apache-2.0
tags:
- gaming
annotation_creators:
- crowdsourced
language_creators:
- crowdsourced
pretty_name: pale
size_categories:
- 10K<n<100K
task_categories:
- text-generation
- text-classification
- automatic-speech-recognition
---
# Dataset card for pale
## Table of contents
- [Dataset description](#dataset-description)
- [Dataset summary](#dataset-summary)
- [Dataset structure](#dataset-structure)
- [Dataset instance](#dataset-instance)
- [Dataset fields](#dataset-fields)
## Dataset description
- **Homepage:** [pale homepage](https://huggingface.co/datasets/zeio/pale)
- **Repository:** [pale repository](https://huggingface.co/datasets/zeio/pale)
- **Point of contact:** [Zeio Nara](mailto:zeionara@gmail.com)
- **Dataset version:** `30.10.2023`
### Dataset summary
This dataset contains league of legends champions' quotes parsed from [fandom](https://leagueoflegends.fandom.com).
See dataset viewer at the [derivative repo](/datasets/zeio/auto-pale).
See dataset usage example [at google colab](https://cutt.ly/3wEKDUI9).
The dataset is available in the following configurations:
1. `vanilla` - all data pulled from the website, with no significant modification beyond parsing the page structure;
1. `quotes` - a truncated version of the corpus that doesn't contain sound effects;
1. `annotated` - an extended version of the full configuration with a couple of additional label columns;
1. `pulled` - the same as `vanilla`, but the sound files have been pulled from the website and the `source` column is replaced with `sound`.
## Dataset structure
### Data instance
An example of an entry from the dataset is given below:
```json
{
"header": "Attack",
"subheader": "Attacking",
"text": "Kindred: \"The masks of the Kindred seek you!\"",
"source": "https://static.wikia.nocookie.net/leagueoflegends/images/1/12/Kindred_Original_Passive_Mark_Enemy_6.ogg/revision/latest?cb=20221204121356",
"champion": "kindred"
}
```
### Data fields
Each dataset entry therefore consists of the following fields:
- `header` - main category of the text;
- `subheader` - secondary category of the text (none in some cases);
- `text` - text said by the champion or description of sound made by the champion;
- `source` - link to the audio file (only `vanilla` configuration);
- `champion` - name of the champion in lowercase;
- `quote` - binary field displaying whether corresponding text contains quote or not (only `annotated` configuration);
- `sound` - audio data for the entry (only `pulled` configuration).
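Entries with these fields can be processed as plain dictionaries; a small sketch grouping quote texts by champion (the first entry is taken from the example above, the second is invented for illustration):

```python
from collections import defaultdict

# First entry from the example instance above; the second is invented.
entries = [
    {"header": "Attack", "text": 'Kindred: "The masks of the Kindred seek you!"', "champion": "kindred"},
    {"header": "Move", "text": 'Kindred: "We walk as one."', "champion": "kindred"},
]

# Group quote texts by the lowercase champion name.
quotes_by_champion = defaultdict(list)
for entry in entries:
    quotes_by_champion[entry["champion"]].append(entry["text"])

print(len(quotes_by_champion["kindred"]))  # → 2
```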
|
BRAIN-TR/flag-dataset | ---
license: apache-2.0
language:
- tr
- en
pretty_name: Bayrak Veri Seti
size_categories:
- 10K<n<100K
---
The flag dataset was created for recognizing flag images that appear in pictures. The data were collected from Google Images and curated. All rights to the data provided here are reserved by brain-tr.
The dataset consists of the single class flag and is intended for object-detection problems. Its general characteristics are as follows:
- The dataset is in YOLO format, but the brain-tr repo provides converters such as yolo2Voc and yolo2Coco, which you can use to convert it to the format you need.
- The dataset contains 29,375 records in total. |
Francesco/abdomen-mri | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': abdomen-MRI
'1': 0
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: abdomen-mri
tags:
- rf100
---
# Dataset Card for abdomen-mri
**The original COCO dataset is stored at `dataset.tar.gz`**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/abdomen-mri
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
abdomen-mri
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
    'width': 640,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
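The `bbox` field uses the COCO convention `[x_min, y_min, width, height]`; a small sketch (not part of the dataset tooling) converting a box to corner coordinates:

```python
def coco_to_corners(bbox):
    """Convert a COCO-format box [x_min, y_min, width, height]
    into corner coordinates [x_min, y_min, x_max, y_max]."""
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

# First box from the example instance above.
print(coco_to_corners([302.0, 109.0, 73.0, 52.0]))  # → [302.0, 109.0, 375.0, 161.0]
```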
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/abdomen-mri
### Citation Information
```
@misc{ abdomen-mri,
title = { abdomen mri Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/abdomen-mri } },
url = { https://universe.roboflow.com/object-detection/abdomen-mri },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
joey234/mmlu-high_school_world_history-neg-answer | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_answer
dtype: string
splits:
- name: test
num_bytes: 393405
num_examples: 237
download_size: 212102
dataset_size: 393405
---
# Dataset Card for "mmlu-high_school_world_history-neg-answer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
prompty/Furia | ---
license: gfdl
---
|
CyberHarem/pieri_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of pieri/ピエリ (Fire Emblem)
This is the dataset of pieri/ピエリ (Fire Emblem), containing 194 images and their tags.
The core tags of this character are `blue_hair, multicolored_hair, hair_over_one_eye, pink_hair, twintails, breasts, two-tone_hair, red_eyes, gradient_hair, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 194 | 216.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pieri_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 194 | 127.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pieri_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 427 | 258.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pieri_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 194 | 190.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pieri_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 427 | 357.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pieri_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/pieri_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, simple_background, solo, armor, white_background, smile, looking_at_viewer, sword, upper_body |
| 1 | 10 |  |  |  |  |  | 1girl, armor, solo, spear, open_mouth |
| 2 | 14 |  |  |  |  |  | 1girl, solo, nipples, nude, pussy, smile, blush, looking_at_viewer, uncensored, navel |
| 3 | 7 |  |  |  |  |  | blush, nipples, nude, solo_focus, 1boy, 1girl, cum_on_breasts, hetero, smile, cum_on_hair, facial, paizuri, penis, censored, closed_eyes, collarbone, long_hair, open_mouth |
| 4 | 7 |  |  |  |  |  | 1boy, 1girl, hetero, nipples, solo_focus, open_mouth, blush, pink_eyes, ahegao, completely_nude, medium_breasts, sex_from_behind, simple_background, tongue_out, arm_grab, arm_held_back, navel, standing_sex, vaginal |
| 5 | 9 |  |  |  |  |  | 1boy, fellatio, hetero, penis, 1girl, solo_focus, uncensored, nude, blush, english_text, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | simple_background | solo | armor | white_background | smile | looking_at_viewer | sword | upper_body | spear | open_mouth | nipples | nude | pussy | blush | uncensored | navel | solo_focus | 1boy | cum_on_breasts | hetero | cum_on_hair | facial | paizuri | penis | censored | closed_eyes | collarbone | long_hair | pink_eyes | ahegao | completely_nude | medium_breasts | sex_from_behind | tongue_out | arm_grab | arm_held_back | standing_sex | vaginal | fellatio | english_text |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:-------------------|:--------|:--------------------|:--------|:-------------|:--------|:-------------|:----------|:-------|:--------|:--------|:-------------|:--------|:-------------|:-------|:-----------------|:---------|:--------------|:---------|:----------|:--------|:-----------|:--------------|:-------------|:------------|:------------|:---------|:------------------|:-----------------|:------------------|:-------------|:-----------|:----------------|:---------------|:----------|:-----------|:---------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | | X | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | | X | | | X | X | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | | | X | | | | | X | X | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | | | | | | | | | X | X | | | X | | X | X | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | |
| 5 | 9 |  |  |  |  |  | X | X | | | | | | | | | | | X | | X | X | | X | X | | X | | | | X | | | | | | | | | | | | | | | X | X |
|
one-sec-cv12/chunk_100 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 24332512176.875
num_examples: 253337
download_size: 22555624925
dataset_size: 24332512176.875
---
# Dataset Card for "chunk_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
miraclenugget/ayaka | ---
license: unknown
---
|
shuyuej/prompt_consistency_training_fewer | ---
license: apache-2.0
---
# 🚀 Load Dataset
```python
from datasets import load_dataset
dataset = load_dataset("shuyuej/prompt_consistency_training_fewer")
dataset = dataset["train"]
print(dataset)
```
|
akjindal53244/temp2 | ---
configs:
- config_name: default
data_files:
- split: train
path: train_dataset.json
- split: test
path: eval_dataset.json
license: apache-2.0
--- |
Atsushi/fungi_diagnostic_chars_comparison_japanese | ---
annotations_creators:
- other
language:
- ja
license:
- cc-by-4.0
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
size_categories:
- 100K<n<1M
---
fungi_diagnostic_chars_comparison_japanese
Daikinrin "Summary of Diagnostic Characters" dataset
Last updated: 2024/2/23 (up to R3-11457)
====
### Languages
Japanese
This dataset is available in Japanese only.
# Overview
[Daikinrin](http://mycoscouter.coolblog.jp/daikinrin/), a website run personally by Atsushi Nakajima (中島淳志), provides summaries and indexes of several thousand mycological taxonomy papers in the form of "three-line paper summaries".
As part of this work, descriptions of the diagnostic characters that are "shared" or "different" between one fungus and another are extracted by hand.
This dataset aggregates the extracted diagnostic characters, semi-automatically annotated with categories such as "color" and "shape".
The "three-line paper summaries" are updated daily, but this dataset is expected to be updated roughly once a month.
## Related datasets
"Three-line paper summaries"
[Atsushi/fungi_indexed_mycological_papers_japanese](https://huggingface.co/datasets/Atsushi/fungi_indexed_mycological_papers_japanese)
"Trait Circus dataset" (controlled traits)
[Atsushi/fungi_trait_circus_database](https://huggingface.co/datasets/Atsushi/fungi_trait_circus_database)
## Column descriptions
* R3ID … the ID of the Daikinrin "three-line paper summary".
* No … a number assigned within each R3ID so that every diagnostic sentence has a unique ID.
* comparison_source … the taxon (scientific name) that the comparison is made from.
* comparison_target … the taxon (scientific name) that the comparison is made against.
* sentence … the diagnostic sentence. All sentences are in Japanese.
* label … the semi-automatically assigned category (manually corrected, but not double-checked, so some misclassifications may remain). The following 25 categories exist:
* サイズ/size
* 分子系統解析/molecular_phylogenetic_analysis
* 形状/shape
* 色/color
* 地理的分布/geographical_distribution
* 生息環境/habitat
* 表面性状/surface_characteristics
* 構造/structure
* 有無/presence
* 形態全般/general_morphology
* 位置/position
* 二次代謝産物/secondary_metabolite
* 呈色反応/chemical_reaction
* 数量/amount
* 発達/development
* 生理学的形質/physiological_characters
* 分類/classification
* 資化・発酵能/assimilation_and_fermentation
* 質感/texture
* 味・臭い/taste_and_smell
* 病害・病原性関連/disease_and_pathogenecity
* 全般/general_characters
* 耐性・感受性/resistance_and_susceptibility
* 栄養摂取様式/nutrition_style
* 未分類/unclassified
* common_or_different … `1` for shared characters, `0` for differing characters.
* data_source … the URL of the source (literature) for each record. |
atmallen/qm_bob_easy_2_grader_first_1.0e | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
splits:
- name: train
num_bytes: 10359818.0
num_examples: 117117
- name: validation
num_bytes: 1000602.0
num_examples: 11279
- name: test
num_bytes: 993048.0
num_examples: 11186
download_size: 2650402
dataset_size: 12353468.0
---
# Dataset Card for "qm_bob_easy_2_grader_first_1.0e"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Piyush2512/custom2 | ---
dataset_info:
features:
- name: audio_data
dtype: binary
- name: emotion
dtype: string
splits:
- name: train
num_bytes: 606005211
num_examples: 7442
download_size: 605589377
dataset_size: 606005211
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Jellywibble/dalio-reward-model-hackathon-dataset | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 8765
num_examples: 16
download_size: 6055
dataset_size: 8765
---
# Dataset Card for "dalio-reward-model-hackathon-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
flagship/rice-aug_thermal-new_demo | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': RiceLeafs_BrownSpot
'1': RiceLeafs_Healthy
'2': RiceLeafs_Hispa
'3': RiceLeafs_LeafBlast
splits:
- name: train
num_bytes: 193534629.265
num_examples: 3731
- name: test
num_bytes: 944624.0
num_examples: 129
download_size: 188013508
dataset_size: 194479253.265
---
# Dataset Card for "rice-aug_thermal-new_demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
YAGO1818/Meauzin | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_Gille__StrangeMerges_52-7B-dare_ties | ---
pretty_name: Evaluation run of Gille/StrangeMerges_52-7B-dare_ties
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Gille/StrangeMerges_52-7B-dare_ties](https://huggingface.co/Gille/StrangeMerges_52-7B-dare_ties)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Gille__StrangeMerges_52-7B-dare_ties\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T22:53:52.409980](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_52-7B-dare_ties/blob/main/results_2024-04-02T22-53-52.409980.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6544195482791636,\n\
\ \"acc_stderr\": 0.031911351519696686,\n \"acc_norm\": 0.6538113078244907,\n\
\ \"acc_norm_stderr\": 0.032575369162425614,\n \"mc1\": 0.48714810281517745,\n\
\ \"mc1_stderr\": 0.01749771794429982,\n \"mc2\": 0.6575532617687332,\n\
\ \"mc2_stderr\": 0.015160045339993093\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6680887372013652,\n \"acc_stderr\": 0.01376098820088054,\n\
\ \"acc_norm\": 0.6902730375426621,\n \"acc_norm_stderr\": 0.013512058415238363\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.696673969328819,\n\
\ \"acc_stderr\": 0.004587553577101255,\n \"acc_norm\": 0.871539533957379,\n\
\ \"acc_norm_stderr\": 0.003339179835018285\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4365079365079365,\n \"acc_stderr\": 0.025542846817400496,\n \"\
acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.025542846817400496\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7838709677419354,\n \"acc_stderr\": 0.02341529343356852,\n \"\
acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.02341529343356852\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8390804597701149,\n\
\ \"acc_stderr\": 0.013140225515611724,\n \"acc_norm\": 0.8390804597701149,\n\
\ \"acc_norm_stderr\": 0.013140225515611724\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n\
\ \"acc_stderr\": 0.016519594275297117,\n \"acc_norm\": 0.4223463687150838,\n\
\ \"acc_norm_stderr\": 0.016519594275297117\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.024954184324879912,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.024954184324879912\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"\
acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\
\ \"acc_stderr\": 0.012743072942653349,\n \"acc_norm\": 0.46740547588005216,\n\
\ \"acc_norm_stderr\": 0.012743072942653349\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083376,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083376\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48714810281517745,\n\
\ \"mc1_stderr\": 0.01749771794429982,\n \"mc2\": 0.6575532617687332,\n\
\ \"mc2_stderr\": 0.015160045339993093\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.819258089976322,\n \"acc_stderr\": 0.010814911009613988\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7225170583775588,\n \
\ \"acc_stderr\": 0.012333447581047537\n }\n}\n```"
repo_url: https://huggingface.co/Gille/StrangeMerges_52-7B-dare_ties
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|arc:challenge|25_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|gsm8k|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hellaswag|10_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T22-53-52.409980.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T22-53-52.409980.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- '**/details_harness|winogrande|5_2024-04-02T22-53-52.409980.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T22-53-52.409980.parquet'
- config_name: results
data_files:
- split: 2024_04_02T22_53_52.409980
path:
- results_2024-04-02T22-53-52.409980.parquet
- split: latest
path:
- results_2024-04-02T22-53-52.409980.parquet
---
# Dataset Card for Evaluation run of Gille/StrangeMerges_52-7B-dare_ties
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Gille/StrangeMerges_52-7B-dare_ties](https://huggingface.co/Gille/StrangeMerges_52-7B-dare_ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Gille__StrangeMerges_52-7B-dare_ties",
"harness_winogrande_5",
split="train")
```
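Each dated split name encodes the run timestamp, with `-` and `:` replaced by `_` relative to the ISO timestamp used in the parquet file names. A small helper (a hedged sketch, not part of the leaderboard tooling) can convert a split name back into a `datetime`:

```python
from datetime import datetime

def split_name_to_timestamp(split: str) -> datetime:
    """Parse a split name like '2024_04_02T22_53_52.409980' into a datetime."""
    # The split name replaces '-' in the date and ':' in the time with '_';
    # restore them before parsing with fromisoformat.
    date_part, time_part = split.split("T")
    iso = f"{date_part.replace('_', '-')}T{time_part.replace('_', ':')}"
    return datetime.fromisoformat(iso)

ts = split_name_to_timestamp("2024_04_02T22_53_52.409980")
print(ts.isoformat())  # 2024-04-02T22:53:52.409980
```

This can be used, for instance, to pick the most recent dated split when several runs are present.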
## Latest results
These are the [latest results from run 2024-04-02T22:53:52.409980](https://huggingface.co/datasets/open-llm-leaderboard/details_Gille__StrangeMerges_52-7B-dare_ties/blob/main/results_2024-04-02T22-53-52.409980.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6544195482791636,
"acc_stderr": 0.031911351519696686,
"acc_norm": 0.6538113078244907,
"acc_norm_stderr": 0.032575369162425614,
"mc1": 0.48714810281517745,
"mc1_stderr": 0.01749771794429982,
"mc2": 0.6575532617687332,
"mc2_stderr": 0.015160045339993093
},
"harness|arc:challenge|25": {
"acc": 0.6680887372013652,
"acc_stderr": 0.01376098820088054,
"acc_norm": 0.6902730375426621,
"acc_norm_stderr": 0.013512058415238363
},
"harness|hellaswag|10": {
"acc": 0.696673969328819,
"acc_stderr": 0.004587553577101255,
"acc_norm": 0.871539533957379,
"acc_norm_stderr": 0.003339179835018285
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.025542846817400496,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.025542846817400496
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.0306365913486998,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.0306365913486998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.036028141763926456,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.036028141763926456
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8390804597701149,
"acc_stderr": 0.013140225515611724,
"acc_norm": 0.8390804597701149,
"acc_norm_stderr": 0.013140225515611724
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4223463687150838,
"acc_stderr": 0.016519594275297117,
"acc_norm": 0.4223463687150838,
"acc_norm_stderr": 0.016519594275297117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.024954184324879912,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.024954184324879912
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653349,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653349
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083376,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083376
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48714810281517745,
"mc1_stderr": 0.01749771794429982,
"mc2": 0.6575532617687332,
"mc2_stderr": 0.015160045339993093
},
"harness|winogrande|5": {
"acc": 0.819258089976322,
"acc_stderr": 0.010814911009613988
},
"harness|gsm8k|5": {
"acc": 0.7225170583775588,
"acc_stderr": 0.012333447581047537
}
}
```
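The top-level `"all"` entry aggregates the per-task scores. As a rough illustration only (assuming an unweighted mean, which may differ from the leaderboard's exact aggregation), an aggregate can be recomputed from a results dictionary like the one above:

```python
def mean_metric(results: dict, metric: str) -> float:
    """Average a metric over all tasks that report it, skipping the 'all' entry."""
    values = [task[metric] for name, task in results.items()
              if name != "all" and metric in task]
    return sum(values) / len(values)

# A small excerpt of the results shown above.
results = {
    "all": {"acc": 0.655},
    "harness|winogrande|5": {"acc": 0.819258089976322},
    "harness|gsm8k|5": {"acc": 0.7225170583775588},
}
print(round(mean_metric(results, "acc"), 4))  # 0.7709
```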
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ctoraman/BilCat-news-classification | ---
license: cc
task_categories:
- text-classification
language:
- tr
tags:
- news-classification
- text-classification
- news-categorization
- text-categorization
- news-articles
size_categories:
- 1K<n<10K
---
BilCat: Bilkent Text Classification (News Categorization) Dataset
7,540 Turkish news articles (Milliyet and TRT, merged) with category labels (Dunya, Ekonomi, Politika, KulturSanat, Saglik, Spor, Turkiye, Yazarlar).
The first line is the column header.
Further details are available at https://github.com/BilkentInformationRetrievalGroup/BilCat/
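Since the first line is the column header, the data can be read with a standard CSV reader. The snippet below is a minimal sketch on a hypothetical excerpt: the real file's delimiter and column names may differ.

```python
import csv
import io

# Hypothetical excerpt; the actual BilCat column names and delimiter
# should be checked against the released file.
sample = io.StringIO(
    "category\ttext\n"
    "Spor\tGalatasaray ligde iki mac kazandi.\n"
    "Ekonomi\tMerkez Bankasi faiz kararini acikladi.\n"
)
reader = csv.DictReader(sample, delimiter="\t")
rows = list(reader)
print(len(rows), rows[0]["category"])  # 2 Spor
```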
Citation:
C. Toraman, F. Can and S. Koçberber. Developing a text categorization template for Turkish news portals. 2011 International Symposium on Innovations in Intelligent Systems and Applications, Istanbul, 2011, pp. 379-383. DOI: 10.1109/INISTA.2011.5946096 |
bitaudit/audit_verification_dataset | ---
license: mit
tags:
- code
---
Contains labelled smart contracts |
Djarnis/huggingface-test-model | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: labels_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: description
dtype: string
- name: creator
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: open_issues
dtype: int64
- name: closed_issues
dtype: int64
- name: state
dtype: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: due_on
dtype: 'null'
- name: closed_at
dtype: 'null'
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 10676604
num_examples: 2000
download_size: 2886224
dataset_size: 10676604
---
# Dataset Card for "huggingface-test-model"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/blemishine_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of blemishine/ブレミシャイン/瑕光 (Arknights)
This is the dataset of blemishine/ブレミシャイン/瑕光 (Arknights), containing 190 images and their tags.
The core tags of this character are `animal_ears, long_hair, blonde_hair, horse_ears, animal_ear_fluff, horse_girl, ponytail, yellow_eyes, tail, horse_tail, bow, hair_bow, black_bow, extra_ears, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 190 | 388.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blemishine_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 190 | 316.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blemishine_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 485 | 629.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/blemishine_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/blemishine_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, closed_mouth, fur_trim, looking_at_viewer, simple_background, solo, white_background, breastplate, upper_body, smile, cape, plate_armor, sidelocks, black_gloves, blush |
| 1 | 11 |  |  |  |  |  | 1girl, cape, fur_trim, solo, breastplate, looking_at_viewer, holding_sword, simple_background, black_gloves, shield, smile, closed_mouth, cowboy_shot, orange_eyes, white_background, sidelocks |
| 2 | 11 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, solo, alternate_costume, large_breasts, cleavage, cowboy_shot, smile, simple_background, thighs, blush, collarbone, navel, stomach, white_background, bare_arms, black_bikini, black_leotard, outdoors, thigh_strap, very_long_hair |
| 3 | 7 |  |  |  |  |  | 1girl, blush, cum_in_pussy, 1boy, hetero, open_mouth, penis, solo_focus, tongue_out, large_breasts, navel, uncensored, after_sex, after_vaginal, collarbone, completely_nude, cumdrip, heart, indoors, looking_at_viewer, mosaic_censoring, nipples, smile, spread_legs, symbol-shaped_pupils, thighhighs, window |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | fur_trim | looking_at_viewer | simple_background | solo | white_background | breastplate | upper_body | smile | cape | plate_armor | sidelocks | black_gloves | blush | holding_sword | shield | cowboy_shot | orange_eyes | bare_shoulders | alternate_costume | large_breasts | cleavage | thighs | collarbone | navel | stomach | bare_arms | black_bikini | black_leotard | outdoors | thigh_strap | very_long_hair | cum_in_pussy | 1boy | hetero | open_mouth | penis | solo_focus | tongue_out | uncensored | after_sex | after_vaginal | completely_nude | cumdrip | heart | indoors | mosaic_censoring | nipples | spread_legs | symbol-shaped_pupils | thighhighs | window |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------|:--------------------|:--------------------|:-------|:-------------------|:--------------|:-------------|:--------|:-------|:--------------|:------------|:---------------|:--------|:----------------|:---------|:--------------|:--------------|:-----------------|:--------------------|:----------------|:-----------|:---------|:-------------|:--------|:----------|:------------|:---------------|:----------------|:-----------|:--------------|:-----------------|:---------------|:-------|:---------|:-------------|:--------|:-------------|:-------------|:-------------|:------------|:----------------|:------------------|:----------|:--------|:----------|:-------------------|:----------|:--------------|:-----------------------|:-------------|:---------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | X | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | | | X | X | X | X | | | X | | | | | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | X | | | | | | X | | | | | X | | | | | | | X | | | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
acylru/telegram | ---
task_categories:
- text-generation
language:
- en
- ar
- ru
tags:
- telegram
- channels
- dataset
pretty_name: Telegram's public channels
size_categories:
- 1M<n<10M
--- |
daze-unlv/medmcqa_axolotl | ---
license: apache-2.0
---
|
havli/kitana | ---
license: afl-3.0
---
|
open-llm-leaderboard/details_chlee10__T3Q-Merge-Mistral7B | ---
pretty_name: Evaluation run of chlee10/T3Q-Merge-Mistral7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chlee10/T3Q-Merge-Mistral7B](https://huggingface.co/chlee10/T3Q-Merge-Mistral7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chlee10__T3Q-Merge-Mistral7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-12T08:00:37.957911](https://huggingface.co/datasets/open-llm-leaderboard/details_chlee10__T3Q-Merge-Mistral7B/blob/main/results_2024-03-12T08-00-37.957911.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6510049020249649,\n\
\ \"acc_stderr\": 0.03204889030757393,\n \"acc_norm\": 0.6500060218814862,\n\
\ \"acc_norm_stderr\": 0.032724054630578966,\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7795928259236615,\n\
\ \"mc2_stderr\": 0.013705764896443729\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7141638225255973,\n \"acc_stderr\": 0.013203196088537372,\n\
\ \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7171878111929895,\n\
\ \"acc_stderr\": 0.004494454911844619,\n \"acc_norm\": 0.8914558852818164,\n\
\ \"acc_norm_stderr\": 0.003104306434972473\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n\
\ \"acc_stderr\": 0.01653682964899711,\n \"acc_norm\": 0.42569832402234636,\n\
\ \"acc_norm_stderr\": 0.01653682964899711\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4758800521512386,\n\
\ \"acc_stderr\": 0.012755368722863935,\n \"acc_norm\": 0.4758800521512386,\n\
\ \"acc_norm_stderr\": 0.012755368722863935\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7795928259236615,\n\
\ \"mc2_stderr\": 0.013705764896443729\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8500394632991318,\n \"acc_stderr\": 0.010034394804580809\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7005307050796058,\n \
\ \"acc_stderr\": 0.012616300735519649\n }\n}\n```"
repo_url: https://huggingface.co/chlee10/T3Q-Merge-Mistral7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|arc:challenge|25_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|gsm8k|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hellaswag|10_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T08-00-37.957911.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-12T08-00-37.957911.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- '**/details_harness|winogrande|5_2024-03-12T08-00-37.957911.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-12T08-00-37.957911.parquet'
- config_name: results
data_files:
- split: 2024_03_12T08_00_37.957911
path:
- results_2024-03-12T08-00-37.957911.parquet
- split: latest
path:
- results_2024-03-12T08-00-37.957911.parquet
---
# Dataset Card for Evaluation run of chlee10/T3Q-Merge-Mistral7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [chlee10/T3Q-Merge-Mistral7B](https://huggingface.co/chlee10/T3Q-Merge-Mistral7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chlee10__T3Q-Merge-Mistral7B",
"harness_winogrande_5",
split="train")
```
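The config name for any task follows directly from the harness task identifier: pipes, colons, and dashes all become underscores. A minimal helper illustrating the mapping (an assumption inferred from the config names listed in this card, not part of any official tooling):

```python
def task_to_config(task: str) -> str:
    """Map a harness task id like 'harness|hendrycksTest-virology|5'
    to this dataset's config name 'harness_hendrycksTest_virology_5'."""
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

print(task_to_config("harness|hendrycksTest-virology|5"))
# harness_hendrycksTest_virology_5
print(task_to_config("harness|truthfulqa:mc|0"))
# harness_truthfulqa_mc_0
```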
## Latest results
These are the [latest results from run 2024-03-12T08:00:37.957911](https://huggingface.co/datasets/open-llm-leaderboard/details_chlee10__T3Q-Merge-Mistral7B/blob/main/results_2024-03-12T08-00-37.957911.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the per-task configs and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6510049020249649,
"acc_stderr": 0.03204889030757393,
"acc_norm": 0.6500060218814862,
"acc_norm_stderr": 0.032724054630578966,
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7795928259236615,
"mc2_stderr": 0.013705764896443729
},
"harness|arc:challenge|25": {
"acc": 0.7141638225255973,
"acc_stderr": 0.013203196088537372,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.7171878111929895,
"acc_stderr": 0.004494454911844619,
"acc_norm": 0.8914558852818164,
"acc_norm_stderr": 0.003104306434972473
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066485,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42569832402234636,
"acc_stderr": 0.01653682964899711,
"acc_norm": 0.42569832402234636,
"acc_norm_stderr": 0.01653682964899711
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4758800521512386,
"acc_stderr": 0.012755368722863935,
"acc_norm": 0.4758800521512386,
"acc_norm_stderr": 0.012755368722863935
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7795928259236615,
"mc2_stderr": 0.013705764896443729
},
"harness|winogrande|5": {
"acc": 0.8500394632991318,
"acc_stderr": 0.010034394804580809
},
"harness|gsm8k|5": {
"acc": 0.7005307050796058,
"acc_stderr": 0.012616300735519649
}
}
```
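The per-task MMLU (`hendrycksTest`) scores above can be aggregated into a single unweighted average with a few lines of Python. This is only a sketch operating on a dictionary shaped like the JSON above, shown here with a small illustrative subset rather than the full results:

```python
def mmlu_average(results: dict) -> float:
    """Unweighted mean of `acc` over all hendrycksTest (MMLU) tasks."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

# Illustrative subset of the results JSON above.
sample = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.64},
    "harness|winogrande|5": {"acc": 0.85},  # ignored: not an MMLU task
}
print(mmlu_average(sample))  # 0.475
```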
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
FarmerlineML/akan_dataset | ---
dataset_info:
features:
- name: transcription
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 3273722106.408
num_examples: 12143
- name: test
num_bytes: 196182968.0
num_examples: 824
download_size: 2593637030
dataset_size: 3469905074.408
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
jason-lee08/TinyStoriesExclamationValidation | ---
dataset_info:
features:
- name: validation
dtype: string
splits:
- name: train
num_bytes: 322761
num_examples: 405
download_size: 100666
dataset_size: 322761
---
# Dataset Card for "TinyStoriesExclamationValidation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xuchenhz/adl_recitation | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
--- |
CyberHarem/shirasaka_koume_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shirasaka_koume/白坂小梅 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of shirasaka_koume/白坂小梅 (THE iDOLM@STER: Cinderella Girls), containing 500 images and their tags.
The core tags of this character are `hair_over_one_eye, blonde_hair, short_hair, earrings, red_eyes, ear_piercing, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g., Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 560.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirasaka_koume_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 333.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirasaka_koume_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1147 | 687.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirasaka_koume_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 502.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirasaka_koume_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1147 | 957.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirasaka_koume_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shirasaka_koume_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, choker, hair_flower, jewelry, solo, bags_under_eyes, hair_bow, skull, kimono, looking_at_viewer, ribbon, smile, bare_shoulders, blush, obi, dress, frills, lipstick, lolita_fashion, microphone, open_mouth |
| 1 | 8 |  |  |  |  |  | 1girl, bandages, bare_shoulders, detached_sleeves, sleeves_past_wrists, solo, bags_under_eyes, eyeball, looking_at_viewer, halloween, jack-o'-lantern, jewelry, blush, open_mouth, smile, spider_web_print, black_dress, ghost, ribbon, moon, pumpkin_hair_ornament |
| 2 | 14 |  |  |  |  |  | 1girl, hood_down, hoodie, looking_at_viewer, bags_under_eyes, solo, sleeves_past_fingers, jewelry, skirt, blush, open_mouth, simple_background |
| 3 | 7 |  |  |  |  |  | 1girl, hood_down, looking_at_viewer, piercing, simple_background, solo, white_background, blush, jewelry, sleeves_past_fingers, black_hoodie, blood, collarbone, long_sleeves, smile, closed_mouth, open_mouth, upper_body |
| 4 | 8 |  |  |  |  |  | 1girl, hood_down, long_sleeves, looking_at_viewer, plaid_skirt, pleated_skirt, sleeves_past_fingers, solo, blush, jewelry, piercing, red_skirt, smile, black_hoodie, blood, simple_background, white_background, black_pantyhose, closed_mouth |
| 5 | 12 |  |  |  |  |  | 1girl, bags_under_eyes, blush, looking_at_viewer, solo, apron, maid_headdress, open_mouth, sleeves_past_fingers, jewelry, skull, smile, enmaided |
| 6 | 17 |  |  |  |  |  | 1girl, solo, blush, jewelry, looking_at_viewer, smile, bags_under_eyes, dress, hairclip, open_mouth, piercing, skirt, skull, mini_top_hat, striped_thighhighs, microphone, sleeves_past_fingers |
| 7 | 7 |  |  |  |  |  | 1girl, blush, hat, looking_at_viewer, sleeves_past_wrists, solo, striped_sleeves, bags_under_eyes, choker, necklace, skirt, smile, bespectacled, ring, open_mouth, skull_print |
| 8 | 7 |  |  |  |  |  | 1girl, hair_bow, looking_at_viewer, smile, solo, blush, white_background, bare_shoulders, open_mouth, piercing, simple_background, bridal_gauntlets, nail_polish, necklace, purple_bow, purple_dress, ribbon, rose, skull, upper_body |
| 9 | 6 |  |  |  |  |  | 1girl, blush, piercing, pleated_skirt, white_shirt, bangs, red_neckerchief, sailor_collar, serafuku, short_sleeves, simple_background, solo, black_skirt, looking_at_viewer, thick_thighs, white_background, white_thighhighs, open_mouth |
| 10 | 7 |  |  |  |  |  | 1boy, 1girl, blush, hetero, shiny_hair, shiny_skin, black_hoodie, solo_focus, piercing, anus, ass_grab, deep_skin, from_behind, hood_down, looking_back, sleeves_past_wrists, thighhighs, bar_censor, black_panties, cum, nipples, oral, penis, red_skirt, small_breasts, thighs, white_panties |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | choker | hair_flower | jewelry | solo | bags_under_eyes | hair_bow | skull | kimono | looking_at_viewer | ribbon | smile | bare_shoulders | blush | obi | dress | frills | lipstick | lolita_fashion | microphone | open_mouth | bandages | detached_sleeves | sleeves_past_wrists | eyeball | halloween | jack-o'-lantern | spider_web_print | black_dress | ghost | moon | pumpkin_hair_ornament | hood_down | hoodie | sleeves_past_fingers | skirt | simple_background | piercing | white_background | black_hoodie | blood | collarbone | long_sleeves | closed_mouth | upper_body | plaid_skirt | pleated_skirt | red_skirt | black_pantyhose | apron | maid_headdress | enmaided | hairclip | mini_top_hat | striped_thighhighs | hat | striped_sleeves | necklace | bespectacled | ring | skull_print | bridal_gauntlets | nail_polish | purple_bow | purple_dress | rose | white_shirt | bangs | red_neckerchief | sailor_collar | serafuku | short_sleeves | black_skirt | thick_thighs | white_thighhighs | 1boy | hetero | shiny_hair | shiny_skin | solo_focus | anus | ass_grab | deep_skin | from_behind | looking_back | thighhighs | bar_censor | black_panties | cum | nipples | oral | penis | small_breasts | thighs | white_panties |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------|:--------------|:----------|:-------|:------------------|:-----------|:--------|:---------|:--------------------|:---------|:--------|:-----------------|:--------|:------|:--------|:---------|:-----------|:-----------------|:-------------|:-------------|:-----------|:-------------------|:----------------------|:----------|:------------|:------------------|:-------------------|:--------------|:--------|:-------|:------------------------|:------------|:---------|:-----------------------|:--------|:--------------------|:-----------|:-------------------|:---------------|:--------|:-------------|:---------------|:---------------|:-------------|:--------------|:----------------|:------------|:------------------|:--------|:-----------------|:-----------|:-----------|:---------------|:---------------------|:------|:------------------|:-----------|:---------------|:-------|:--------------|:-------------------|:--------------|:-------------|:---------------|:-------|:--------------|:--------|:------------------|:----------------|:-----------|:----------------|:--------------|:---------------|:-------------------|:-------|:---------|:-------------|:-------------|:-------------|:-------|:-----------|:------------|:--------------|:---------------|:-------------|:-------------|:----------------|:------|:----------|:-------|:--------|:----------------|:---------|:----------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | | | X | X | X | | | | X | X | X | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | | | X | X | X | | | | X | | | | X | | | | | | | X | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | X | X | | | | | X | | X | | X | | | | | | | X | | | | | | | | | | | | X | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | | | X | X | | | | | X | | X | | X | | | | | | | | | | | | | | | | | | | X | | X | | X | X | X | X | X | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 12 |  |  |  |  |  | X | | | X | X | X | | X | | X | | X | | X | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 17 |  |  |  |  |  | X | | | X | X | X | | X | | X | | X | | X | | X | | | | X | X | | | | | | | | | | | | | | X | X | | X | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | X | | | X | X | | | | X | | X | | X | | | | | | | X | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 7 |  |  |  |  |  | X | | | | X | | X | X | | X | X | X | X | X | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | | | | | | X | | | | | | | | | | | | | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 6 |  |  |  |  |  | X | | | | X | | | | | X | | | | X | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 10 | 7 |  |  |  |  |  | X | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | X | | | | | X | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
parkervg/blendsql-test-dbs | ---
license: apache-2.0
---
|
tner/ttc_dummy | ---
language:
- en
license:
- other
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: TTC
---
# Dataset Card for "tner/ttc" (Dummy)
***WARNING***: This is a dummy dataset for `ttc`; the correct one is [`tner/ttc`](https://huggingface.co/datasets/tner/ttc), which is private since **the TTC dataset is not publicly released at this point**. We will grant you access to the `tner/ttc` dataset once you have obtained the original dataset from the authors (you need to send an inquiry to Shruti Rijhwani, `srijhwan@cs.cmu.edu`). See their repository for more details: [https://github.com/shrutirij/temporal-twitter-corpus](https://github.com/shrutirij/temporal-twitter-corpus).
Once you are granted access to the original TTC dataset by the authors, please request access [here](https://huggingface.co/datasets/tner/ttc_dummy/discussions/1).
## Dataset Description
- **Repository:** [T-NER](https://github.com/asahi417/tner)
- **Paper:** [https://aclanthology.org/2020.acl-main.680/](https://aclanthology.org/2020.acl-main.680/)
- **Dataset:** Temporal Twitter Corpus
- **Domain:** Twitter
- **Number of Entity:** 3
### Dataset Summary
Temporal Twitter Corpus NER dataset formatted as part of the [TNER](https://github.com/asahi417/tner) project.
- Entity Types: `LOC`, `ORG`, `PER`
## Dataset Structure
### Data Instances
An example of `train` looks as follows.
```
{
'tokens': ['😝', 'lemme', 'ask', '$MENTION$', ',', 'Timb', '???', '"', '$MENTION$', ':', '$RESERVED$', '!!!', '"', '$MENTION$', ':', '$MENTION$', 'Nezzzz', '!!', 'How', "'", 'bout', 'do', 'a', 'duet', 'with', '$MENTION$', '??!', ';)', '"'],
'tags': [6, 6, 6, 6, 6, 2, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6, 6]
}
```
### Label ID
The label2id dictionary can be found [here](https://huggingface.co/datasets/tner/btc/raw/main/dataset/label.json).
```python
{
"B-LOC": 0,
"B-ORG": 1,
"B-PER": 2,
"I-LOC": 3,
"I-ORG": 4,
"I-PER": 5,
"O": 6
}
```
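With this mapping, the integer `tags` in each example can be decoded back to BIO label strings by inverting the dictionary, for instance:

```python
# Invert label2id to decode integer tag ids into BIO label strings.
label2id = {
    "B-LOC": 0, "B-ORG": 1, "B-PER": 2,
    "I-LOC": 3, "I-ORG": 4, "I-PER": 5,
    "O": 6,
}
id2label = {v: k for k, v in label2id.items()}

def decode(tags):
    """Map a list of integer tag ids to their label strings."""
    return [id2label[t] for t in tags]

print(decode([6, 6, 2, 6]))  # ['O', 'O', 'B-PER', 'O']
```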
### Data Splits
| name |train|validation|test|
|---------|----:|---------:|---:|
|ttc | 9995| 500|1477|
### Citation Information
```
@inproceedings{rijhwani-preotiuc-pietro-2020-temporally,
title = "Temporally-Informed Analysis of Named Entity Recognition",
author = "Rijhwani, Shruti and
Preotiuc-Pietro, Daniel",
booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.acl-main.680",
doi = "10.18653/v1/2020.acl-main.680",
pages = "7605--7617",
abstract = "Natural language processing models often have to make predictions on text data that evolves over time as a result of changes in language use or the information described in the text. However, evaluation results on existing data sets are seldom reported by taking the timestamp of the document into account. We analyze and propose methods that make better use of temporally-diverse training data, with a focus on the task of named entity recognition. To support these experiments, we introduce a novel data set of English tweets annotated with named entities. We empirically demonstrate the effect of temporal drift on performance, and how the temporal information of documents can be used to obtain better models compared to those that disregard temporal information. Our analysis gives insights into why this information is useful, in the hope of informing potential avenues of improvement for named entity recognition as well as other NLP tasks under similar experimental setups.",
}
``` |
open-llm-leaderboard/details_AA051611__limb | ---
pretty_name: Evaluation run of AA051611/limb
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051611/limb](https://huggingface.co/AA051611/limb) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051611__limb\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-14T17:31:13.154923](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__limb/blob/main/results_2024-01-14T17-31-13.154923.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7173948628205344,\n\
\ \"acc_stderr\": 0.029795425890422344,\n \"acc_norm\": 0.7228232912878558,\n\
\ \"acc_norm_stderr\": 0.030359217292974663,\n \"mc1\": 0.3990208078335373,\n\
\ \"mc1_stderr\": 0.017142825728496767,\n \"mc2\": 0.5836669238966421,\n\
\ \"mc2_stderr\": 0.01521191071011394\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.014361097288449712,\n\
\ \"acc_norm\": 0.6348122866894198,\n \"acc_norm_stderr\": 0.0140702655192688\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6357299342760406,\n\
\ \"acc_stderr\": 0.0048024139199326675,\n \"acc_norm\": 0.8307110137422824,\n\
\ \"acc_norm_stderr\": 0.0037424055874098806\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n\
\ \"acc_stderr\": 0.04024778401977108,\n \"acc_norm\": 0.6814814814814815,\n\
\ \"acc_norm_stderr\": 0.04024778401977108\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7849056603773585,\n \"acc_stderr\": 0.025288394502891363,\n\
\ \"acc_norm\": 0.7849056603773585,\n \"acc_norm_stderr\": 0.025288394502891363\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03309615177059007,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03309615177059007\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.0351494255126744,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.0351494255126744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.723404255319149,\n \"acc_stderr\": 0.029241883869628813,\n\
\ \"acc_norm\": 0.723404255319149,\n \"acc_norm_stderr\": 0.029241883869628813\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n\
\ \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n\
\ \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7034482758620689,\n \"acc_stderr\": 0.03806142687309992,\n\
\ \"acc_norm\": 0.7034482758620689,\n \"acc_norm_stderr\": 0.03806142687309992\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.626984126984127,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\"\
: 0.626984126984127,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8451612903225807,\n \"acc_stderr\": 0.020579287326583227,\n \"\
acc_norm\": 0.8451612903225807,\n \"acc_norm_stderr\": 0.020579287326583227\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5517241379310345,\n \"acc_stderr\": 0.034991131376767445,\n \"\
acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.034991131376767445\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\
\ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9141414141414141,\n \"acc_stderr\": 0.019960225563172885,\n \"\
acc_norm\": 0.9141414141414141,\n \"acc_norm_stderr\": 0.019960225563172885\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078898,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078898\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.764102564102564,\n \"acc_stderr\": 0.021525965407408726,\n \
\ \"acc_norm\": 0.764102564102564,\n \"acc_norm_stderr\": 0.021525965407408726\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4111111111111111,\n \"acc_stderr\": 0.029999923508706682,\n \
\ \"acc_norm\": 0.4111111111111111,\n \"acc_norm_stderr\": 0.029999923508706682\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7899159663865546,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.7899159663865546,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247443,\n \"\
acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247443\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8880733944954129,\n \"acc_stderr\": 0.013517352714958792,\n \"\
acc_norm\": 0.8880733944954129,\n \"acc_norm_stderr\": 0.013517352714958792\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6342592592592593,\n \"acc_stderr\": 0.03284738857647206,\n \"\
acc_norm\": 0.6342592592592593,\n \"acc_norm_stderr\": 0.03284738857647206\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8921568627450981,\n \"acc_stderr\": 0.021770522281368394,\n \"\
acc_norm\": 0.8921568627450981,\n \"acc_norm_stderr\": 0.021770522281368394\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8818565400843882,\n \"acc_stderr\": 0.02101105265987846,\n \
\ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.02101105265987846\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7399103139013453,\n\
\ \"acc_stderr\": 0.029442495585857476,\n \"acc_norm\": 0.7399103139013453,\n\
\ \"acc_norm_stderr\": 0.029442495585857476\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.032178294207446306,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.032178294207446306\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.0291998024556228,\n \"acc_norm\"\
: 0.8842975206611571,\n \"acc_norm_stderr\": 0.0291998024556228\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.03083349114628123,\n\
\ \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.03083349114628123\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8737864077669902,\n \"acc_stderr\": 0.03288180278808628,\n\
\ \"acc_norm\": 0.8737864077669902,\n \"acc_norm_stderr\": 0.03288180278808628\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n\
\ \"acc_stderr\": 0.017893784904018543,\n \"acc_norm\": 0.9188034188034188,\n\
\ \"acc_norm_stderr\": 0.017893784904018543\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263714,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263714\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.896551724137931,\n\
\ \"acc_stderr\": 0.0108904525446915,\n \"acc_norm\": 0.896551724137931,\n\
\ \"acc_norm_stderr\": 0.0108904525446915\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.02326752843210017,\n\
\ \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.02326752843210017\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n\
\ \"acc_stderr\": 0.016536829648997112,\n \"acc_norm\": 0.42569832402234636,\n\
\ \"acc_norm_stderr\": 0.016536829648997112\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.022140767512880973,\n\
\ \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.022140767512880973\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7813504823151125,\n\
\ \"acc_stderr\": 0.02347558141786111,\n \"acc_norm\": 0.7813504823151125,\n\
\ \"acc_norm_stderr\": 0.02347558141786111\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8117283950617284,\n \"acc_stderr\": 0.02175186606081587,\n\
\ \"acc_norm\": 0.8117283950617284,\n \"acc_norm_stderr\": 0.02175186606081587\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5283687943262412,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.529986962190352,\n\
\ \"acc_stderr\": 0.012747248967079058,\n \"acc_norm\": 0.529986962190352,\n\
\ \"acc_norm_stderr\": 0.012747248967079058\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.025767252010855946,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.025767252010855946\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7630718954248366,\n \"acc_stderr\": 0.01720166216978978,\n \
\ \"acc_norm\": 0.7630718954248366,\n \"acc_norm_stderr\": 0.01720166216978978\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7836734693877551,\n \"acc_stderr\": 0.026358916334904028,\n\
\ \"acc_norm\": 0.7836734693877551,\n \"acc_norm_stderr\": 0.026358916334904028\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.023729830881018526,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.023729830881018526\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759033,\n \
\ \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759033\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3990208078335373,\n\
\ \"mc1_stderr\": 0.017142825728496767,\n \"mc2\": 0.5836669238966421,\n\
\ \"mc2_stderr\": 0.01521191071011394\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.797947908445146,\n \"acc_stderr\": 0.01128501375404745\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.55420773313116,\n \
\ \"acc_stderr\": 0.013691305174506698\n }\n}\n```"
repo_url: https://huggingface.co/AA051611/limb
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|arc:challenge|25_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|gsm8k|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hellaswag|10_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T17-31-13.154923.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-14T17-31-13.154923.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- '**/details_harness|winogrande|5_2024-01-14T17-31-13.154923.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-14T17-31-13.154923.parquet'
- config_name: results
data_files:
- split: 2024_01_14T17_31_13.154923
path:
- results_2024-01-14T17-31-13.154923.parquet
- split: latest
path:
- results_2024-01-14T17-31-13.154923.parquet
---
# Dataset Card for Evaluation run of AA051611/limb
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051611/limb](https://huggingface.co/AA051611/limb) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051611__limb",
"harness_winogrande_5",
split="train")
```
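Since each run's split is named after its timestamp, the chronologically newest run can be recovered by sorting the split names as plain strings. A minimal sketch of that convention (the earlier timestamp below is hypothetical, added for illustration):

```python
# Splits are named after the run timestamp, e.g. "2024_01_14T17_31_13.154923".
# Because the format is "YYYY_MM_DDTHH_MM_SS.ffffff", lexicographic order
# matches chronological order, so the plain string maximum is the newest run
# (which is what the "latest" alias resolves to).

def latest_split(names):
    """Return the chronologically newest timestamp-named split."""
    return max(names)

splits = [
    "2024_01_10T00_00_00.000000",  # hypothetical earlier run
    "2024_01_14T17_31_13.154923",  # the run recorded in this card
]
print(latest_split(splits))  # -> 2024_01_14T17_31_13.154923
```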
## Latest results
These are the [latest results from run 2024-01-14T17:31:13.154923](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051611__limb/blob/main/results_2024-01-14T17-31-13.154923.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.7173948628205344,
"acc_stderr": 0.029795425890422344,
"acc_norm": 0.7228232912878558,
"acc_norm_stderr": 0.030359217292974663,
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496767,
"mc2": 0.5836669238966421,
"mc2_stderr": 0.01521191071011394
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.014361097288449712,
"acc_norm": 0.6348122866894198,
"acc_norm_stderr": 0.0140702655192688
},
"harness|hellaswag|10": {
"acc": 0.6357299342760406,
"acc_stderr": 0.0048024139199326675,
"acc_norm": 0.8307110137422824,
"acc_norm_stderr": 0.0037424055874098806
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.04024778401977108,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.04024778401977108
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7849056603773585,
"acc_stderr": 0.025288394502891363,
"acc_norm": 0.7849056603773585,
"acc_norm_stderr": 0.025288394502891363
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059007,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059007
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.45,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.0351494255126744,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.0351494255126744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.723404255319149,
"acc_stderr": 0.029241883869628813,
"acc_norm": 0.723404255319149,
"acc_norm_stderr": 0.029241883869628813
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7034482758620689,
"acc_stderr": 0.03806142687309992,
"acc_norm": 0.7034482758620689,
"acc_norm_stderr": 0.03806142687309992
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.626984126984127,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.626984126984127,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8451612903225807,
"acc_stderr": 0.020579287326583227,
"acc_norm": 0.8451612903225807,
"acc_norm_stderr": 0.020579287326583227
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.034991131376767445,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.034991131376767445
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9141414141414141,
"acc_stderr": 0.019960225563172885,
"acc_norm": 0.9141414141414141,
"acc_norm_stderr": 0.019960225563172885
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078898,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078898
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.764102564102564,
"acc_stderr": 0.021525965407408726,
"acc_norm": 0.764102564102564,
"acc_norm_stderr": 0.021525965407408726
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4111111111111111,
"acc_stderr": 0.029999923508706682,
"acc_norm": 0.4111111111111111,
"acc_norm_stderr": 0.029999923508706682
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7899159663865546,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.7899159663865546,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.45695364238410596,
"acc_stderr": 0.04067325174247443,
"acc_norm": 0.45695364238410596,
"acc_norm_stderr": 0.04067325174247443
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8880733944954129,
"acc_stderr": 0.013517352714958792,
"acc_norm": 0.8880733944954129,
"acc_norm_stderr": 0.013517352714958792
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6342592592592593,
"acc_stderr": 0.03284738857647206,
"acc_norm": 0.6342592592592593,
"acc_norm_stderr": 0.03284738857647206
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8921568627450981,
"acc_stderr": 0.021770522281368394,
"acc_norm": 0.8921568627450981,
"acc_norm_stderr": 0.021770522281368394
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.02101105265987846,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.02101105265987846
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7399103139013453,
"acc_stderr": 0.029442495585857476,
"acc_norm": 0.7399103139013453,
"acc_norm_stderr": 0.029442495585857476
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.032178294207446306,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.032178294207446306
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.0291998024556228,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.0291998024556228
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.03083349114628123,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.03083349114628123
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8737864077669902,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.8737864077669902,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9188034188034188,
"acc_stderr": 0.017893784904018543,
"acc_norm": 0.9188034188034188,
"acc_norm_stderr": 0.017893784904018543
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263714,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263714
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.896551724137931,
"acc_stderr": 0.0108904525446915,
"acc_norm": 0.896551724137931,
"acc_norm_stderr": 0.0108904525446915
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.02326752843210017,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.02326752843210017
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42569832402234636,
"acc_stderr": 0.016536829648997112,
"acc_norm": 0.42569832402234636,
"acc_norm_stderr": 0.016536829648997112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.022140767512880973,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.022140767512880973
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7813504823151125,
"acc_stderr": 0.02347558141786111,
"acc_norm": 0.7813504823151125,
"acc_norm_stderr": 0.02347558141786111
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8117283950617284,
"acc_stderr": 0.02175186606081587,
"acc_norm": 0.8117283950617284,
"acc_norm_stderr": 0.02175186606081587
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5283687943262412,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.5283687943262412,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.529986962190352,
"acc_stderr": 0.012747248967079058,
"acc_norm": 0.529986962190352,
"acc_norm_stderr": 0.012747248967079058
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.025767252010855946,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.025767252010855946
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7630718954248366,
"acc_stderr": 0.01720166216978978,
"acc_norm": 0.7630718954248366,
"acc_norm_stderr": 0.01720166216978978
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7836734693877551,
"acc_stderr": 0.026358916334904028,
"acc_norm": 0.7836734693877551,
"acc_norm_stderr": 0.026358916334904028
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.023729830881018526,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.023729830881018526
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.021904291355759033,
"acc_norm": 0.95,
"acc_norm_stderr": 0.021904291355759033
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496767,
"mc2": 0.5836669238966421,
"mc2_stderr": 0.01521191071011394
},
"harness|winogrande|5": {
"acc": 0.797947908445146,
"acc_stderr": 0.01128501375404745
},
"harness|gsm8k|5": {
"acc": 0.55420773313116,
"acc_stderr": 0.013691305174506698
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ethux/Dutch-GOV-Law-wetten.overheid.nl | ---
license: apache-2.0
language:
- nl
size_categories:
- 10K<n<100K
---
# Dutch GOV Laws
This dataset was created by scraping https://wetten.overheid.nl; I used the sitemap to collect all possible URLs.
Some URLs may be missing: around 1% returned a 404 or 405 error.
I created this dataset because I couldn't find any other existing dataset with this data.
So here it is. Enjoy!
### Please note: this dataset is not completely checked or cleaned; it is a work in progress for me, and I went for the easy approach. |
vwxyzjn/summarize_from_feedback_tldr_3_filtered_oai_preprocessing_1708444324 | ---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: subreddit
    dtype: string
  - name: title
    dtype: string
  - name: post
    dtype: string
  - name: summary
    dtype: string
  - name: query_token
    sequence: string
  - name: query
    dtype: string
  - name: reference_response
    dtype: string
  - name: reference_response_token
    sequence: int64
  - name: reference_response_token_len
    dtype: int64
  - name: query_reference_response
    dtype: string
  - name: query_reference_response_token
    sequence: int64
  - name: query_reference_response_token_response_label
    sequence: int64
  - name: query_reference_response_token_len
    dtype: int64
  splits:
  - name: train
    num_bytes: 1646213234
    num_examples: 116722
  - name: validation
    num_bytes: 91015850
    num_examples: 6447
  - name: test
    num_bytes: 92609639
    num_examples: 6553
  download_size: 568625536
  dataset_size: 1829838723
---
# TL;DR SFT Dataset for OpenAI's [Summarize from Feedback](https://openai.com/blog/summarization/) task
The dataset is directly taken from https://github.com/openai/summarize-from-feedback/tree/700967448d10004279f138666442bf1497d0e705#reddit-tldr-dataset
These columns are taken directly from the aforementioned dataset:
* **id**: unique identifier for the post
* **subreddit**: subreddit the post was taken from
* **title**: title of the post
* **post**: body of the post
* **summary**: summary of the post
* **reference_response**: reference response for the post
These columns are added by this preprocessing script:
* **query**: length-limited query for summarization: OAI pre-processes the main text (title + subreddit + post), ensuring it has only 512 tokens; if the main text is too long, then it tries to truncate at the last `\n`. If it's too short it pads the main text ([summarize_from_feedback/tasks.py#L98-L165](https://github.com/openai/summarize-from-feedback/blob/700967448d10004279f138666442bf1497d0e705/summarize_from_feedback/tasks.py#L98-L165)). Padding is either space or `[PAD]` token (see Args below).
* **query_token**: tokenized version of `query`
* **reference_response_token**: tokenized version of `reference_response`
* **reference_response_token_len**: length of `reference_response_token`
* **query_reference_response**: concatenation of `query.strip()` and `reference_response`
* **query_reference_response_token**: tokenized version of `query_reference_response`, up to `max_sft_query_response_length` tokens
* **query_reference_response_token_len**: length of `query_reference_response_token`
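For intuition, the relationship between `query`, `reference_response`, and `query_reference_response` can be sketched with plain strings (the values below are hypothetical; the actual pipeline also handles tokenization and length limits):

```python
# Hypothetical strings illustrating how the derived columns relate.
# The real pipeline additionally produces tokenized variants of each field.
query = "SUBREDDIT: r/test\n\nTITLE: A title\n\nPOST: A post body\n\nTL;DR:"
reference_response = " A short summary.<|endoftext|>"

# query_reference_response is the concatenation of query.strip() and reference_response.
query_reference_response = query.strip() + reference_response
```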
# Args
```python
{'base_model': 'EleutherAI/pythia-1b-deduped',
 'check_length_correctness': True,
 'cnndm_params': TaskQueryHParams(length=1919,
                                  format_str='Article:\n{article}\n\nTL;DR:\n',
                                  truncate_field='article',
                                  truncate_text='\n',
                                  padding='pad_token',
                                  pad_token=[50277],
                                  pad_side='left',
                                  max_sft_response_length=None,
                                  max_sft_query_response_length=None,
                                  max_rm_response_length=155,
                                  max_rm_query_response_length=2021),
 'debug': False,
 'hf_entity': 'vwxyzjn',
 'push_to_hub': True,
 'tldr_params': TaskQueryHParams(length=512,
                                 format_str='SUBREDDIT: r/{subreddit}\n'
                                            '\n'
                                            'TITLE: {title}\n'
                                            '\n'
                                            'POST: {post}\n'
                                            '\n'
                                            'TL;DR:',
                                 truncate_field='post',
                                 truncate_text='\n',
                                 padding='pad_token',
                                 pad_token=[50277],
                                 pad_side='left',
                                 max_sft_response_length=53,
                                 max_sft_query_response_length=562,
                                 max_rm_response_length=169,
                                 max_rm_query_response_length=638)}
```
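For illustration, the `format_str` from `tldr_params` above can be filled in like so (the post fields are made up; this is a sketch, not the actual preprocessing code, which also tokenizes, truncates, and pads):

```python
# Sketch: filling the TL;DR format_str from the Args above.
# The field values below are hypothetical, purely for illustration.
format_str = (
    "SUBREDDIT: r/{subreddit}\n"
    "\n"
    "TITLE: {title}\n"
    "\n"
    "POST: {post}\n"
    "\n"
    "TL;DR:"
)

example = {
    "subreddit": "AskReddit",
    "title": "How is the query built?",
    "post": "Body of the post goes here.",
}

query = format_str.format(**example)
print(query)
```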
|
mtkinit/mtkinit_testing3 | ---
pretty_name: mtkinit/testing3
---
# mtkinit/testing3
Created from AIOD platform |
facebook/pmd | ---
annotations_creators:
- found
language_creators:
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10M<n<100M
source_datasets:
- original
task_categories:
- image-to-text
task_ids:
- image-captioning
paperswithcode_id: pmd
pretty_name: PMD
extra_gated_prompt: |
  By clicking on “Access repository” below, you also agree to individual licensing terms for each of the subset datasets of the PMD as noted at https://huggingface.co/datasets/facebook/pmd#additional-information.
---
# Dataset Card for PMD
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Preprocessing](#dataset-preprocessing)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Compared to original FLAVA paper](#compared-to-original-flava-paper)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [PMD homepage](https://flava-model.github.io/)
- **Repository:** [PMD repository](https://huggingface.co/datasets/facebook/pmd)
- **Paper:** [FLAVA: A Foundational Language And Vision Alignment Model
](https://arxiv.org/abs/2112.04482)
- **Leaderboard:**
- **Point of Contact:** [Amanpreet Singh](mailto:amanpreet@nyu.edu)
### Dataset Summary
Introduced in the FLAVA paper, Public Multimodal Dataset (PMD) is a collection of publicly-available image-text pair datasets. PMD contains 70M image-text pairs in total with 68M unique images. The dataset contains pairs from Conceptual Captions, Conceptual Captions 12M, WIT, Localized Narratives, RedCaps, COCO, SBU Captions, Visual Genome and a subset of YFCC100M dataset.
If you use PMD, please cite the original FLAVA paper as follows, along with the individual datasets (see the Citation Information section below for references):
```bibtex
@inproceedings{singh2022flava,
title={Flava: A foundational language and vision alignment model},
author={Singh, Amanpreet and Hu, Ronghang and Goswami, Vedanuj and Couairon, Guillaume and Galuba, Wojciech and Rohrbach, Marcus and Kiela, Douwe},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
pages={15638--15650},
year={2022}
}
```
You can load this dataset by first logging into Hugging Face using `huggingface-cli login` and then running the following commands:
```py
from datasets import load_dataset
pmd = load_dataset("facebook/pmd", use_auth_token=True)
```
You can also load the dataset in streaming mode if you don't want to download the big dataset files (> 50GB locally without the images):
```py
pmd = load_dataset("facebook/pmd", use_auth_token=True, streaming=True)
```
### Dataset Preprocessing
This dataset doesn't download all of the images locally by default. Instead, it exposes URLs for some of the images. To fetch the images, use the following code:
```python
from concurrent.futures import ThreadPoolExecutor
from functools import partial
import io
import urllib.request

import PIL.Image

from datasets import load_dataset
from datasets.utils.file_utils import get_datasets_user_agent

USER_AGENT = get_datasets_user_agent()


def fetch_single_image(image_data, timeout=None, retries=0):
    image_url, image = image_data
    if image is not None:
        return image
    for _ in range(retries + 1):
        try:
            request = urllib.request.Request(
                image_url,
                data=None,
                headers={"user-agent": USER_AGENT},
            )
            with urllib.request.urlopen(request, timeout=timeout) as req:
                image = PIL.Image.open(io.BytesIO(req.read()))
            break
        except Exception:
            image = None
    return image


def fetch_images(batch, num_threads, timeout=None, retries=0):
    fetch_single_image_with_args = partial(fetch_single_image, timeout=timeout, retries=retries)
    with ThreadPoolExecutor(max_workers=num_threads) as executor:
        batch["image"] = list(executor.map(fetch_single_image_with_args, zip(batch["image_url"], batch["image"])))
    return batch


num_threads = 20
dset = load_dataset("pmd", use_auth_token=True)
dset = dset.map(fetch_images, batched=True, batch_size=100, fn_kwargs={"num_threads": num_threads})
```
#### Save to disk
You can also save the dataset to disk for faster and direct loading next time but beware of the space required:
```py
dset.save_to_disk(</path/to/save>)
```
#### Load Subsets
You can also download a specific set from the PMD dataset by using
```py
dset = load_dataset("pmd", <choice>, use_auth_token=True)
```
The choices are:
```
"all", "coco", "sbu", "wit", "localized_narratives", "conceptual_captions", "visual_genome", "conceptual_captions_12M", "redcaps", "yfcc100M_subset", "localized_narratives_openimages", "localized_narratives_ade20k", "localized_narratives_coco"
```
#### Flickr30K Localized Narratives Subset
The Flickr30K subset of Localized Narratives is not included by default as it requires a manual download. You can include it by downloading the tar file from [here](http://shannon.cs.illinois.edu/DenotationGraph/data/index.html) to `</path/to/Downloads>` (after signing an agreement) and then loading the whole PMD or the Localized Narratives subset by:
```py
dset = load_dataset("pmd", data_dir=</path/to/Downloads/flickr30k-images.tar.gz>, use_auth_token=True, use_flickr30k_ln=True)
# Load LN subset only
dset = load_dataset("pmd", "localized_narratives", data_dir=</path/to/Downloads/flickr30k-images.tar.gz>, use_auth_token=True, use_flickr30k_ln=True)
```
#### Facing issues?
If you are facing issues, you can try loading a specific revision of the repo by using:
```py
dset = load_dataset("pmd", use_auth_token=True, revision="311cd48")
```
### Supported Tasks and Leaderboards
In the FLAVA paper, the dataset has been used to pretrain the FLAVA model as a source of well-aligned image-text pairs. This allows having a generic vision-and-language model which can be fine-tuned for a variety of tasks.
We anticipate that the dataset can be used to train deep neural networks that perform image captioning and that learn transferable visual representations for a variety of downstream visual recognition tasks (image classification, object detection, instance segmentation). We also anticipate that the dataset could be used for a variety of vision-and-language (V&L) tasks, such as image or text retrieval or text-to-image synthesis.
### Languages
All of the subsets in PMD use English as their primary language.
## Dataset Structure
### Data Instances
Each instance in PMD represents a single image-text pair:
```
{
  'image_url': None,
  'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x480 at 0x7FCFF86A1E80>,
  'text': 'A woman wearing a net on her head cutting a cake. ',
  'source': 'coco',
  'meta': '{\n "annotation": [\n "A woman wearing a net on her head cutting a cake. "\n ],\n "image_path": "zip:/val2014/COCO_val2014_000000522418.jpg::http:/images.cocodataset.org/zips/val2014.zip"\n}'
}
```
### Data Fields
- `image_url`: Static URL for downloading the image associated with the text. Can be `None` if image is locally available.
- `image`: A PIL Image object for the image associated with the text. Can be `None` if image is not locally available.
- `text`: `str`, A textual description corresponding to the image.
- `source`: `str`, The PMD subset which this pair is from.
- `meta`: `str`, A json representation of the original annotation from the dataset.
### Data Splits
All the data is contained in the training set. The training set has nearly 70M instances.
We intend for this dataset to be primarily used for pre-training with one or more specific downstream task(s) in mind. Thus, all of the instances should be used for pretraining. If required, we specifically make sure that there is no overlap with Karpathy's COCO validation set so users can use that subset for any validation purposes. Users can also load Karpathy's val subset by specifying the "validation" split while loading PMD. This will also load other "validation" splits for some subsets, if they are available.
## Dataset Creation
### Curation Rationale
From the paper:
> Purely contrastive methods, however, also have important shortcomings. Their cross-modal nature does not make them easily usable on multimodal problems that require dealing with both modalities at the same time. They require large corpora, which for both CLIP and ALIGN have not been made accessible to the research community and the details of which remain shrouded in mystery, notwithstanding well-known issues with the construction of such datasets
### Source Data
#### Initial Data Collection and Normalization
From the paper:
> **Data Collection Pipeline**
- For the YFCC100M dataset, we filter the image-text data by discarding non-English captions and only keeping captions that contain more than two words from the description field of each image, if this does not pass our filters we consider the title field. Other than that, we did not do any additional filtering.
- For the VisualGenome, COCO and Localized Narratives subsets, we remove any overlaps with Karpathy's COCO val and test sets.
- For Localized Narratives, we split the original caption, which is a paragraph, into multiple captions using the spaCy library and take the cartesian product, so that each sample becomes a separate image-text pair.
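The splitting step can be sketched as follows (a rough illustration in which a naive sentence splitter stands in for spaCy; the caption and image identifier are hypothetical):

```python
# Rough sketch of the Localized Narratives splitting described above.
# The real pipeline uses the spaCy library for sentence segmentation; here a
# naive split on ". " stands in, purely for illustration.
caption = "A dog runs on grass. A ball lies nearby. The sky is clear."
image_id = "example_image_001"  # hypothetical identifier

sentences = [s.strip() for s in caption.split(". ") if s.strip()]
# Pair the image with each sentence, yielding one image-text pair per sentence.
pairs = [(image_id, sentence) for sentence in sentences]
```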
#### Compared to original FLAVA paper
The PMD dataset in this repo doesn't correspond 1:1 exactly to the original PMD dataset used in the [FLAVA](https://arxiv.org/abs/2112.04482) paper though this repo is built by the same authors. This is due to difficulty in reproducing WiT and YFCC100M subsets exactly. This repo in general contains more data than the PMD in the FLAVA paper and hence should probably result in better performance.
#### Who are the source language producers?
Please refer to the original dataset papers to understand where the content is coming from.
### Annotations
#### Annotation process
The dataset is a combination of existing public datasets with some filtering applied on top so there is no annotation process involved.
#### Who are the annotators?
Please refer to the original dataset papers to understand where the content is coming from.
### Personal and Sensitive Information
Please refer to the original dataset papers to understand where the content is coming from. For example, a detailed description on this for RedCaps can be found [here](https://huggingface.co/datasets/red_caps).
## Considerations for Using the Data
### Social Impact of Dataset
From the paper:
> **Has an analysis of the potential impact of the dataset and its use on data subjects (e.g.,
a data protection impact analysis) been conducted?**
No.
### Discussion of Biases
Please refer to the original dataset papers to understand where the content is coming from. For example, a detailed description on this for RedCaps can be found [here](https://huggingface.co/datasets/red_caps).
### Other Known Limitations
From the paper:
> **Are there any errors, sources of noise, or redundancies in the dataset?**
PMD is noisy by design since image-text pairs on the internet are noisy and unstructured. Though, since it contains sources such as COCO, Visual Genome, and Localized Narratives which are hand-curated by annotators, it has a lot of well-aligned data as well. So, it is definitely more aligned compared to e.g. LAION.
Some instances may also have duplicate images and captions but should have almost no effect in training large-scale models.
> **Does the dataset contain data that might be considered confidential (e.g., data that is
protected by legal privilege or by doctor-patient confidentiality, data that includes the
content of individuals non-public communications)?**
Not that the authors know of. Please refer to the original dataset papers to understand where the content is coming from. For example, a detailed description on this for RedCaps can be found [here](https://huggingface.co/datasets/red_caps).
## Additional Information
### Dataset Curators
The authors of the original dataset papers, as well as the authors of the FLAVA paper (Amanpreet, Ronghang, Vedanuj, Guillaume, Wojciech, Marcus and Douwe).
### Licensing Information
Here are the individual licenses from each of the datasets that apply if you use this dataset:
#### COCO
The annotations in the COCO dataset belong to the COCO Consortium and are licensed under a Creative Commons Attribution 4.0 License.
The COCO Consortium does not own the copyright of the images. Use of the images must abide by the Flickr Terms of Use. The users of the images accept full responsibility for the use of the dataset, including but not limited to the use of any copies of copyrighted images that they may create from the dataset.
#### Conceptual Captions
The dataset may be freely used for any purpose, although acknowledgement of Google LLC ("Google") as the data source would be appreciated. The dataset is provided "AS IS" without any warranty, express or implied. Google disclaims all liability for any damages, direct or indirect, resulting from the use of the dataset.
#### WIT
This data is available under the [Creative Commons Attribution-ShareAlike 3.0 Unported](LICENSE) license.
#### Visual Genome
Visual Genome by Ranjay Krishna et al is licensed under a Creative Commons Attribution 4.0 International License.
#### Localized Narratives
All the annotations available through this website are released under a [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/) license. You are free to redistribute and modify the annotations, but we ask you to please keep the original attribution to our paper.
#### YFCC100M
Use of the original media files is subject to the Creative Commons licenses chosen by their creators/uploaders. License information for each media file can be found within [the YFCC100M metadata](https://multimediacommons.wordpress.com/yfcc100m-core-dataset/#yfcc100m). Use of the dataset is subject to the relevant Webscope License Agreement, which you need to agree to if you use this dataset.
#### RedCaps
The image metadata is licensed under the CC-BY 4.0 license. Additionally, uses of this dataset are subject to the Reddit API terms (https://www.reddit.com/wiki/api-terms), and users must comply with the Reddit User Agreement, Content Policy, and Privacy Policy, all accessible at https://www.redditinc.com/policies.
Similar to RedCaps:
> PMD should only be used for non-commercial research. PMD should not be used for any tasks that involve identifying features related to people (facial recognition, gender, age, ethnicity identification, etc.) or make decisions that impact people (mortgages, job applications, criminal sentences; or moderation decisions about user-uploaded data that could result in bans from a website). Any commercial and for-profit uses of PMD are restricted – it should not be used to train models that will be deployed in production systems as part of a product offered by businesses or government agencies.
### Citation Information
Please cite the main FLAVA paper in which PMD was introduced along with each of the subsets used in PMD as follows:
```bibtex
@inproceedings{singh2022flava,
title={Flava: A foundational language and vision alignment model},
author={Singh, Amanpreet and Hu, Ronghang and Goswami, Vedanuj and Couairon, Guillaume and Galuba, Wojciech and Rohrbach, Marcus and Kiela, Douwe},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
pages={15638--15650},
year={2022}
}
@article{chen2015microsoft,
title={Microsoft coco captions: Data collection and evaluation server},
author={Chen, Xinlei and Fang, Hao and Lin, Tsung-Yi and Vedantam, Ramakrishna and Gupta, Saurabh and Doll{\'a}r, Piotr and Zitnick, C Lawrence},
journal={arXiv preprint arXiv:1504.00325},
year={2015}
}
@inproceedings{ordonez2011sbucaptions,
Author = {Vicente Ordonez and Girish Kulkarni and Tamara L. Berg},
Title = {Im2Text: Describing Images Using 1 Million Captioned Photographs},
Booktitle = {Neural Information Processing Systems ({NIPS})},
Year = {2011},
}
@article{krishna2017visual,
title={Visual genome: Connecting language and vision using crowdsourced dense image annotations},
author={Krishna, Ranjay and Zhu, Yuke and Groth, Oliver and Johnson, Justin and Hata, Kenji and Kravitz, Joshua and Chen, Stephanie and Kalantidis, Yannis and Li, Li-Jia and Shamma, David A and others},
journal={International journal of computer vision},
volume={123},
number={1},
pages={32--73},
year={2017},
publisher={Springer}
}
@article{srinivasan2021wit,
title={WIT: Wikipedia-based Image Text Dataset for Multimodal Multilingual Machine Learning},
author={Srinivasan, Krishna and Raman, Karthik and Chen, Jiecao and Bendersky, Michael and Najork, Marc},
journal={arXiv preprint arXiv:2103.01913},
year={2021}
}
@inproceedings{sharma2018conceptual,
title={Conceptual captions: A cleaned, hypernymed, image alt-text dataset for automatic image captioning},
author={Sharma, Piyush and Ding, Nan and Goodman, Sebastian and Soricut, Radu},
booktitle={Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)},
pages={2556--2565},
year={2018}
}
@inproceedings{changpinyo2021conceptual,
title={Conceptual 12m: Pushing web-scale image-text pre-training to recognize long-tail visual concepts},
author={Changpinyo, Soravit and Sharma, Piyush and Ding, Nan and Soricut, Radu},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
pages={3558--3568},
year={2021}
}
@inproceedings{ponttuset2020localized,
author = {Jordi Pont-Tuset and Jasper Uijlings and Soravit Changpinyo and Radu Soricut and Vittorio Ferrari},
title = {Connecting Vision and Language with Localized Narratives},
booktitle = {ECCV},
year = {2020}
}
@article{thomee2016yfcc100m,
title={YFCC100M: The new data in multimedia research},
author={Thomee, Bart and Shamma, David A and Friedland, Gerald and Elizalde, Benjamin and Ni, Karl and Poland, Douglas and Borth, Damian and Li, Li-Jia},
journal={Communications of the ACM},
volume={59},
number={2},
pages={64--73},
year={2016},
publisher={ACM New York, NY, USA}
}
@misc{desai2021redcaps,
title={RedCaps: web-curated image-text data created by the people, for the people},
author={Karan Desai and Gaurav Kaul and Zubin Aysola and Justin Johnson},
year={2021},
eprint={2111.11431},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
### Contributions
Thanks to [@aps](https://github.com/apsdehal), [Thomas Wang](https://huggingface.co/TimeRobber), and [@VictorSanh](https://huggingface.co/VictorSanh) for adding this dataset. |
mcimpoi/alot | ---
task_categories:
- image-classification
language:
- en
pretty_name: Amsterdam Library of Textures (ALOT)
size_categories:
- 10K<n<100K
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '1'
'1': '10'
'2': '100'
'3': '101'
'4': '102'
'5': '103'
'6': '104'
'7': '105'
'8': '106'
'9': '107'
'10': '108'
'11': '109'
'12': '11'
'13': '110'
'14': '111'
'15': '112'
'16': '113'
'17': '114'
'18': '115'
'19': '116'
'20': '117'
'21': '118'
'22': '119'
'23': '12'
'24': '120'
'25': '121'
'26': '122'
'27': '123'
'28': '124'
'29': '125'
'30': '126'
'31': '127'
'32': '128'
'33': '129'
'34': '13'
'35': '130'
'36': '131'
'37': '132'
'38': '133'
'39': '134'
'40': '135'
'41': '136'
'42': '137'
'43': '138'
'44': '139'
'45': '14'
'46': '140'
'47': '141'
'48': '142'
'49': '143'
'50': '144'
'51': '145'
'52': '146'
'53': '147'
'54': '148'
'55': '149'
'56': '15'
'57': '150'
'58': '151'
'59': '152'
'60': '153'
'61': '154'
'62': '155'
'63': '156'
'64': '157'
'65': '158'
'66': '159'
'67': '16'
'68': '160'
'69': '161'
'70': '162'
'71': '163'
'72': '164'
'73': '165'
'74': '166'
'75': '167'
'76': '168'
'77': '169'
'78': '17'
'79': '170'
'80': '171'
'81': '172'
'82': '173'
'83': '174'
'84': '175'
'85': '176'
'86': '177'
'87': '178'
'88': '179'
'89': '18'
'90': '180'
'91': '181'
'92': '182'
'93': '183'
'94': '184'
'95': '185'
'96': '186'
'97': '187'
'98': '188'
'99': '189'
'100': '19'
'101': '190'
'102': '191'
'103': '192'
'104': '193'
'105': '194'
'106': '195'
'107': '196'
'108': '197'
'109': '198'
'110': '199'
'111': '2'
'112': '20'
'113': '200'
'114': '201'
'115': '202'
'116': '203'
'117': '204'
'118': '205'
'119': '206'
'120': '207'
'121': '208'
'122': '209'
'123': '21'
'124': '210'
'125': '211'
'126': '212'
'127': '213'
'128': '214'
'129': '215'
'130': '216'
'131': '217'
'132': '218'
'133': '219'
'134': '22'
'135': '220'
'136': '221'
'137': '222'
'138': '223'
'139': '224'
'140': '225'
'141': '226'
'142': '227'
'143': '228'
'144': '229'
'145': '23'
'146': '230'
'147': '231'
'148': '232'
'149': '233'
'150': '234'
'151': '235'
'152': '236'
'153': '237'
'154': '238'
'155': '239'
'156': '24'
'157': '240'
'158': '241'
'159': '242'
'160': '243'
'161': '244'
'162': '245'
'163': '246'
'164': '247'
'165': '248'
'166': '249'
'167': '25'
'168': '250'
'169': '26'
'170': '27'
'171': '28'
'172': '29'
'173': '3'
'174': '30'
'175': '31'
'176': '32'
'177': '33'
'178': '34'
'179': '35'
'180': '36'
'181': '37'
'182': '38'
'183': '39'
'184': '4'
'185': '40'
'186': '41'
'187': '42'
'188': '43'
'189': '44'
'190': '45'
'191': '46'
'192': '47'
'193': '48'
'194': '49'
'195': '5'
'196': '50'
'197': '51'
'198': '52'
'199': '53'
'200': '54'
'201': '55'
'202': '56'
'203': '57'
'204': '58'
'205': '59'
'206': '6'
'207': '60'
'208': '61'
'209': '62'
'210': '63'
'211': '64'
'212': '65'
'213': '66'
'214': '67'
'215': '68'
'216': '69'
'217': '7'
'218': '70'
'219': '71'
'220': '72'
'221': '73'
'222': '74'
'223': '75'
'224': '76'
'225': '77'
'226': '78'
'227': '79'
'228': '8'
'229': '80'
'230': '81'
'231': '82'
'232': '83'
'233': '84'
'234': '85'
'235': '86'
'236': '87'
'237': '88'
'238': '89'
'239': '9'
'240': '90'
'241': '91'
'242': '92'
'243': '93'
'244': '94'
'245': '95'
'246': '96'
'247': '97'
'248': '98'
'249': '99'
splits:
- name: train
num_bytes: 3302794460.0
num_examples: 20000
- name: test
num_bytes: 411146945.0
num_examples: 2500
- name: dev
num_bytes: 415575782.5
num_examples: 2500
download_size: 4104421810
dataset_size: 4129517187.5
---
# Dataset Card for Amsterdam Library of Textures (ALOT)
## Dataset Description
- **Homepage:** https://aloi.science.uva.nl/public_alot/
- **Paper:** G. J. Burghouts and J. M. Geusebroek, "Material-specific adaptation of color invariant features", Pattern Recognition Letters, vol. 30, no. 3, pp. 306–313, 2009
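The `label` feature's class names (see the YAML metadata above) are the original ALOT material ids "1" through "250", sorted as strings, so label index 0 maps to material "1", index 1 to "10", index 2 to "100", and so on. A minimal sketch of recovering the original material id from a label index, assuming that mapping holds:

```python
# Assumption: the class_label names in this card's metadata are the 250
# original ALOT material ids, sorted lexicographically as strings.
names = sorted(str(i) for i in range(1, 251))

def label_to_material(label: int) -> int:
    """Map an integer label from the dataset back to the original ALOT material id."""
    return int(names[label])

print(label_to_material(0))  # -> 1
print(label_to_material(2))  # -> 100
```

The same mapping is what `datasets.ClassLabel.int2str` would return if the dataset is loaded with `datasets.load_dataset("mcimpoi/alot")`.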
### Licensing Information
Unknown; see the dataset homepage.
### Citation Information
```
@article{burghouts2009material,
  title={Material-specific adaptation of color invariant features},
  author={Burghouts, Gertjan J and Geusebroek, Jan-Mark},
  journal={Pattern Recognition Letters},
  volume={30},
  number={3},
  pages={306--313},
  year={2009},
  publisher={Elsevier}
}
``` |
rifatul123/NoN_generic_248218_type_indian_drug_cleaned | ---
dataset_info:
features:
- name: Uses
dtype: string
- name: SIDEEFFECT
dtype: string
- name: NAME
dtype: string
- name: CLASS
dtype: string
splits:
- name: train
num_bytes: 82166698
num_examples: 248218
download_size: 18116310
dataset_size: 82166698
---
# Dataset Card for "NoN_generic_248218_type_indian_drug_cleaned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NomeIncrivel/Cellbit | ---
license: openrail
---
|
CalvinU/project-gutenberg | ---
license: mit
---
|
CyberHarem/haniyasushin_keiki_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of haniyasushin_keiki/埴安神袿姫/하니야스신케이키 (Touhou)
This is the dataset of haniyasushin_keiki/埴安神袿姫/하니야스신케이키 (Touhou), containing 500 images and their tags.
The core tags of this character are `blue_hair, long_hair, green_headwear, bangs, ribbon, arm_ribbon, breasts, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 674.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haniyasushin_keiki_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 370.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haniyasushin_keiki_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1162 | 771.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haniyasushin_keiki_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 594.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/haniyasushin_keiki_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1162 | 1.09 GiB | [Download](https://huggingface.co/datasets/CyberHarem/haniyasushin_keiki_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/haniyasushin_keiki_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Results of tag clustering; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, green_apron, head_scarf, magatama_necklace, single_strap, smile, solo, yellow_dress, between_fingers, looking_at_viewer, open_mouth, tools, simple_background, black_background, short_sleeves, purple_eyes, upper_body |
| 1 | 6 |  |  |  |  |  | 1girl, between_fingers, flower, green_apron, head_scarf, looking_at_viewer, magatama_necklace, smile, solo, yellow_dress, blue_ribbon, open_mouth, puffy_short_sleeves, purple_eyes, tools, fire, pocket, single_strap |
| 2 | 10 |  |  |  |  |  | 1girl, green_apron, head_scarf, magatama_necklace, single_strap, smile, solo, yellow_dress, closed_mouth, looking_at_viewer, between_fingers, tools, pink_eyes, blush, puffy_short_sleeves, flower, pocket, purple_eyes |
| 3 | 8 |  |  |  |  |  | 1girl, between_fingers, blue_ribbon, green_apron, head_scarf, looking_at_viewer, magatama_necklace, pocket, sandals, simple_background, single_strap, smile, solo, tools, white_background, yellow_dress, full_body, purple_eyes, standing, flower, wide_sleeves, fire, juliet_sleeves, pink_eyes, closed_mouth |
| 4 | 5 |  |  |  |  |  | 1girl, between_fingers, flower, green_apron, head_scarf, looking_at_viewer, magatama_necklace, open_mouth, pink_eyes, solo, tools, yellow_dress, pocket, blue_ribbon, puffy_sleeves, :d, long_sleeves, short_sleeves, upper_body, wide_sleeves |
| 5 | 5 |  |  |  |  |  | 1girl, barefoot, black_background, full_body, green_apron, head_scarf, holding, looking_at_viewer, magatama_necklace, short_sleeves, simple_background, solo, yellow_dress, closed_mouth, single_strap, black_eyes, puffy_sleeves, standing |
| 6 | 6 |  |  |  |  |  | 1girl, between_fingers, green_apron, green_belt, green_scarf, head_scarf, looking_at_viewer, magatama_necklace, open_mouth, pink_eyes, pocket, simple_background, smile, solo, yellow_dress, blue_ribbon, medium_breasts, puffy_short_sleeves, standing, tools, white_background, white_flower, blush, hair_between_eyes, hands_up, frills, yellow_sleeves |
| 7 | 7 |  |  |  |  |  | 1girl, large_breasts, simple_background, solo, blush, head_scarf, navel, purple_eyes, collarbone, looking_at_viewer, upper_body, closed_mouth, nude, puffy_nipples, armpits, hair_between_eyes, shiny, sweat, white_background |
| 8 | 5 |  |  |  |  |  | 1girl, blush, hetero, large_breasts, nipples, solo_focus, 1boy, bar_censor, navel, open_mouth, penis, completely_nude, magatama, spread_legs, vaginal, cowgirl_position, cum_in_pussy, hair_between_eyes, head_scarf, heart-shaped_pupils, jewelry, sex_from_behind, simple_background, sweat, tongue_out |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | green_apron | head_scarf | magatama_necklace | single_strap | smile | solo | yellow_dress | between_fingers | looking_at_viewer | open_mouth | tools | simple_background | black_background | short_sleeves | purple_eyes | upper_body | flower | blue_ribbon | puffy_short_sleeves | fire | pocket | closed_mouth | pink_eyes | blush | sandals | white_background | full_body | standing | wide_sleeves | juliet_sleeves | puffy_sleeves | :d | long_sleeves | barefoot | holding | black_eyes | green_belt | green_scarf | medium_breasts | white_flower | hair_between_eyes | hands_up | frills | yellow_sleeves | large_breasts | navel | collarbone | nude | puffy_nipples | armpits | shiny | sweat | hetero | nipples | solo_focus | 1boy | bar_censor | penis | completely_nude | magatama | spread_legs | vaginal | cowgirl_position | cum_in_pussy | heart-shaped_pupils | jewelry | sex_from_behind | tongue_out |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:-------------|:--------------------|:---------------|:--------|:-------|:---------------|:------------------|:--------------------|:-------------|:--------|:--------------------|:-------------------|:----------------|:--------------|:-------------|:---------|:--------------|:----------------------|:-------|:---------|:---------------|:------------|:--------|:----------|:-------------------|:------------|:-----------|:---------------|:-----------------|:----------------|:-----|:---------------|:-----------|:----------|:-------------|:-------------|:--------------|:-----------------|:---------------|:--------------------|:-----------|:---------|:-----------------|:----------------|:--------|:-------------|:-------|:----------------|:----------|:--------|:--------|:---------|:----------|:-------------|:-------|:-------------|:--------|:------------------|:-----------|:--------------|:----------|:-------------------|:---------------|:----------------------|:----------|:------------------|:-------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | X | | | | X | | X | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | X | X | | | X | | X | X | | X | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | X | | | X | X | X | X | X | X | | | X | | X | X | X | | | X | | X | | | | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | X | X | | X | X | | X | | | X | X | X | | | | | | | | X | | | | | X | X | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | X | X | X | | X | X | X | X | X | X | X | X | | | | | | X | X | | X | | X | X | | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | | X | | | | X | | | X | | | X | | | X | X | | | | | | X | | X | | X | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 8 | 5 |  |  |  |  |  | X | | X | | | | | | | | X | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
anan-2024/twitter_dataset_1713168237 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 193602
num_examples: 515
download_size: 105585
dataset_size: 193602
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gokuls/glue_augmented_wnli | ---
license: apache-2.0
---
# Dataset Card for glue_augmented_wnli
## Dataset Description
Augmented WNLI dataset
**Reference:** https://huggingface.co/datasets/glue |
NeelNanda/pile-small-tokenized-2b | ---
dataset_info:
features:
- name: tokens
sequence: int32
splits:
- name: train
num_bytes: 44263497500
num_examples: 10795975
download_size: 19763664789
dataset_size: 44263497500
---
# Dataset Card for "pile-small-tokenized-2b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
isaaccorley/RoadTracer | ---
license: other
license_name: google-maps-platform-tos
license_link: https://cloud.google.com/maps-platform/terms
---
|
joey234/mmlu-professional_medicine-neg-prepend-verbal | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 14928
num_examples: 5
- name: test
num_bytes: 3884005
num_examples: 272
download_size: 475393
dataset_size: 3898933
---
# Dataset Card for "mmlu-professional_medicine-neg-prepend-verbal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
freshpearYoon/v3_val_free_concat_4 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 3842869832
num_examples: 2500
download_size: 1933496133
dataset_size: 3842869832
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Arham-Imran/Cityscape | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 6899826420.175
num_examples: 2975
- name: val
num_bytes: 1198480272.0
num_examples: 500
download_size: 8228847024
dataset_size: 8098306692.175
---
# Dataset Card for "Cityscape"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vivym/midjourney-prompts | ---
license: apache-2.0
task_categories:
- text-to-image
tags:
- ' midjourney'
language:
- en
---
# midjourney-prompts
## Description
This dataset contains cleaned prompts collected from Midjourney.
Total prompts: 9,085,397
| Version | Count |
| ------- | --------- |
| 5.2 | 2,272,465 |
| 5.1 | 2,060,106 |
| 5.0 | 3,530,770 |
| 4.0 | 1,204,384 |
| 3.0 | 14,991 |
| 2.0 | 791 |
| 1.0 | 1,239 |
| Style | Count |
| --------- | ----------- |
| default | 8,874,181 |
| raw | 177,953 |
| expressive| 27,919 |
| scenic | 2,146 |
| cute | 2,036 |
| original | 511 | |
Amani123/donutdata2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 77291761.0
num_examples: 96
- name: test
num_bytes: 4621446.0
num_examples: 6
- name: validation
num_bytes: 9222827.0
num_examples: 11
download_size: 90088586
dataset_size: 91136034.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
# Dataset Card for "donutdata2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HumanCompatibleAI/ppo-seals-Hopper-v1 | ---
dataset_info:
features:
- name: obs
sequence:
sequence: float64
- name: acts
sequence:
sequence: float32
- name: infos
sequence: string
- name: terminal
dtype: bool
- name: rews
sequence: float32
splits:
- name: train
num_bytes: 57153894
num_examples: 104
download_size: 12420708
dataset_size: 57153894
---
# Dataset Card for "ppo-seals-Hopper-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rakanishu-Lau/chinese-poems | ---
license: other
---
|
CyberHarem/momo_sakurabara_areyoutheonlyonewholovesme | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Momo Sakurabara/桜原桃 (Are you the only one who loves me?)
This is the dataset of Momo Sakurabara/桜原桃 (Are you the only one who loves me?), containing 167 images and their tags.
The core tags of this character are `pink_hair, hair_rings, hair_ornament, long_hair, pink_eyes, mole, mole_under_mouth`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 167 | 121.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/momo_sakurabara_areyoutheonlyonewholovesme/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 167 | 121.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/momo_sakurabara_areyoutheonlyonewholovesme/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 320 | 220.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/momo_sakurabara_areyoutheonlyonewholovesme/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/momo_sakurabara_areyoutheonlyonewholovesme',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Results of tag clustering; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, solo, open_mouth, shirt, short_sleeves, twintails, looking_at_viewer, upper_body, school_uniform, v-shaped_eyebrows, blush, crossed_arms, ribbon |
| 1 | 14 |  |  |  |  |  | 1girl, plaid_skirt, school_uniform, short_sleeves, pleated_skirt, solo, red_skirt, blush, white_shirt, open_mouth, black_socks, breasts, kneehighs, sidelocks |
| 2 | 5 |  |  |  |  |  | bookshelf, looking_at_viewer, plaid_skirt, pleated_skirt, red_skirt, school_uniform, short_sleeves, 1girl, blue_ribbon, indoors, solo, clenched_hands, neck_ribbon, sidelocks, v-shaped_eyebrows, white_shirt, frown |
| 3 | 6 |  |  |  |  |  | 1girl, blue_ribbon, day, hair_between_eyes, outdoors, sidelocks, solo, upper_body, neck_ribbon, tree, closed_mouth, blue_sky, cloud, smile, white_shirt |
| 4 | 5 |  |  |  |  |  | 1girl, day, hair_between_eyes, looking_at_viewer, outdoors, solo, blush, cloud, flower, upper_body, blue_sky, closed_mouth, neck_ribbon, school_uniform, twintails, anime_coloring, blue_ribbon, open_mouth, shirt, sidelocks |
| 5 | 5 |  |  |  |  |  | 1girl, solo, school_uniform, :d, open_mouth, ^_^, twintails, upper_body, waving |
| 6 | 6 |  |  |  |  |  | 1girl, closed_eyes, solo, blush, open_mouth, parody, twintails, hair_bobbles |
| 7 | 6 |  |  |  |  |  | 1girl, school_uniform, smile, solo, upper_body, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | open_mouth | shirt | short_sleeves | twintails | looking_at_viewer | upper_body | school_uniform | v-shaped_eyebrows | blush | crossed_arms | ribbon | plaid_skirt | pleated_skirt | red_skirt | white_shirt | black_socks | breasts | kneehighs | sidelocks | bookshelf | blue_ribbon | indoors | clenched_hands | neck_ribbon | frown | day | hair_between_eyes | outdoors | tree | closed_mouth | blue_sky | cloud | smile | flower | anime_coloring | :d | ^_^ | waving | closed_eyes | parody | hair_bobbles |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------------|:--------|:----------------|:------------|:--------------------|:-------------|:-----------------|:--------------------|:--------|:---------------|:---------|:--------------|:----------------|:------------|:--------------|:--------------|:----------|:------------|:------------|:------------|:--------------|:----------|:-----------------|:--------------|:--------|:------|:--------------------|:-----------|:-------|:---------------|:-----------|:--------|:--------|:---------|:-----------------|:-----|:------|:---------|:--------------|:---------|:---------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | X | X | | X | | | | X | | X | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | | X | | X | | X | X | | | | X | X | X | X | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | | | | | | X | | | | | | | | | X | | | | X | | X | | | X | | X | X | X | X | X | X | X | X | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | X | | X | X | X | X | | X | | | | | | | | | | X | | X | | | X | | X | X | X | | X | X | X | | X | X | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | |
| 6 | 6 |  |  |  |  |  | X | X | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X |
| 7 | 6 |  |  |  |  |  | X | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | |
|
PBWR/Building3D | ---
license: apache-2.0
---
|
DuongTrongChi/data-classification | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: isFollowTheCorrectTask
dtype: int64
splits:
- name: train
num_bytes: 10097442
num_examples: 10000
download_size: 0
dataset_size: 10097442
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data-classification"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HuggingFaceH4/aws-pm-pilot | ---
license: apache-2.0
---
Pilot annotations for the PM dataset that will be used for RLHF. The dataset uses outputs from open-source models (https://huggingface.co/spaces/HuggingFaceH4/instruction-models-outputs) on a mix of Anthropic's hh-rlhf dataset (https://huggingface.co/datasets/HuggingFaceH4/hh-rlhf) and Self-Instruct's seed dataset (https://huggingface.co/datasets/HuggingFaceH4/self-instruct-seed). |
camilo03soares/milo.mp3 | ---
license: openrail
---
|
open-llm-leaderboard/details_BEE-spoke-data__zephyr-220m-dpo-full | ---
pretty_name: Evaluation run of BEE-spoke-data/zephyr-220m-dpo-full
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BEE-spoke-data/zephyr-220m-dpo-full](https://huggingface.co/BEE-spoke-data/zephyr-220m-dpo-full)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BEE-spoke-data__zephyr-220m-dpo-full\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T04:32:33.100189](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__zephyr-220m-dpo-full/blob/main/results_2024-01-05T04-32-33.100189.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2633144761549974,\n\
\ \"acc_stderr\": 0.031001098499088355,\n \"acc_norm\": 0.2646278371332489,\n\
\ \"acc_norm_stderr\": 0.03179755881351347,\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.01517698502770769,\n \"mc2\": 0.43441567768341954,\n\
\ \"mc2_stderr\": 0.015533533425843614\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2030716723549488,\n \"acc_stderr\": 0.011755899303705582,\n\
\ \"acc_norm\": 0.25426621160409557,\n \"acc_norm_stderr\": 0.012724999945157738\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.276638119896435,\n\
\ \"acc_stderr\": 0.004464217420693376,\n \"acc_norm\": 0.2914758016331408,\n\
\ \"acc_norm_stderr\": 0.004535133886462045\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.21481481481481482,\n\
\ \"acc_stderr\": 0.035478541985608264,\n \"acc_norm\": 0.21481481481481482,\n\
\ \"acc_norm_stderr\": 0.035478541985608264\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080343,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080343\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.30057803468208094,\n\
\ \"acc_stderr\": 0.03496101481191181,\n \"acc_norm\": 0.30057803468208094,\n\
\ \"acc_norm_stderr\": 0.03496101481191181\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.028659179374292316,\n\
\ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.028659179374292316\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.15172413793103448,\n \"acc_stderr\": 0.029896107594574617,\n\
\ \"acc_norm\": 0.15172413793103448,\n \"acc_norm_stderr\": 0.029896107594574617\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948368,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948368\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.0361960452412425,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.0361960452412425\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.31290322580645163,\n\
\ \"acc_stderr\": 0.02637756702864586,\n \"acc_norm\": 0.31290322580645163,\n\
\ \"acc_norm_stderr\": 0.02637756702864586\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\"\
: 0.16,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.033464098810559534,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.033464098810559534\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.37305699481865284,\n \"acc_stderr\": 0.03490205592048573,\n\
\ \"acc_norm\": 0.37305699481865284,\n \"acc_norm_stderr\": 0.03490205592048573\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3282051282051282,\n \"acc_stderr\": 0.023807633198657266,\n\
\ \"acc_norm\": 0.3282051282051282,\n \"acc_norm_stderr\": 0.023807633198657266\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184407,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184407\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02755361446786379,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02755361446786379\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3229357798165138,\n \"acc_stderr\": 0.020048115923415332,\n \"\
acc_norm\": 0.3229357798165138,\n \"acc_norm_stderr\": 0.020048115923415332\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2696078431372549,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.29957805907172996,\n \"acc_stderr\": 0.0298180247497531,\n \
\ \"acc_norm\": 0.29957805907172996,\n \"acc_norm_stderr\": 0.0298180247497531\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3004484304932735,\n\
\ \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.3004484304932735,\n\
\ \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728744,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728744\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.03957835471980979,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.03957835471980979\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.03351953879521271,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.03351953879521271\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18376068376068377,\n\
\ \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.18376068376068377,\n\
\ \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2669220945083014,\n\
\ \"acc_stderr\": 0.015818450894777576,\n \"acc_norm\": 0.2669220945083014,\n\
\ \"acc_norm_stderr\": 0.015818450894777576\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.23121387283236994,\n \"acc_stderr\": 0.022698657167855716,\n\
\ \"acc_norm\": 0.23121387283236994,\n \"acc_norm_stderr\": 0.022698657167855716\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574877,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574877\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02564686309713791,\n\
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02564686309713791\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1832797427652733,\n\
\ \"acc_stderr\": 0.021974198848265823,\n \"acc_norm\": 0.1832797427652733,\n\
\ \"acc_norm_stderr\": 0.021974198848265823\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.23765432098765432,\n \"acc_stderr\": 0.023683591837008553,\n\
\ \"acc_norm\": 0.23765432098765432,\n \"acc_norm_stderr\": 0.023683591837008553\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2872340425531915,\n \"acc_stderr\": 0.026992199173064356,\n \
\ \"acc_norm\": 0.2872340425531915,\n \"acc_norm_stderr\": 0.026992199173064356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2503259452411995,\n\
\ \"acc_stderr\": 0.01106415102716543,\n \"acc_norm\": 0.2503259452411995,\n\
\ \"acc_norm_stderr\": 0.01106415102716543\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329376,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329376\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2369281045751634,\n \"acc_stderr\": 0.01720166216978978,\n \
\ \"acc_norm\": 0.2369281045751634,\n \"acc_norm_stderr\": 0.01720166216978978\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3836734693877551,\n \"acc_stderr\": 0.031130880396235943,\n\
\ \"acc_norm\": 0.3836734693877551,\n \"acc_norm_stderr\": 0.031130880396235943\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.22289156626506024,\n\
\ \"acc_stderr\": 0.03240004825594687,\n \"acc_norm\": 0.22289156626506024,\n\
\ \"acc_norm_stderr\": 0.03240004825594687\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23391812865497075,\n \"acc_stderr\": 0.032467217651178264,\n\
\ \"acc_norm\": 0.23391812865497075,\n \"acc_norm_stderr\": 0.032467217651178264\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25091799265605874,\n\
\ \"mc1_stderr\": 0.01517698502770769,\n \"mc2\": 0.43441567768341954,\n\
\ \"mc2_stderr\": 0.015533533425843614\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5098658247829518,\n \"acc_stderr\": 0.014049749833367592\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \
\ \"acc_stderr\": 0.002001305720948082\n }\n}\n```"
repo_url: https://huggingface.co/BEE-spoke-data/zephyr-220m-dpo-full
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|arc:challenge|25_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|gsm8k|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hellaswag|10_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-32-33.100189.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T04-32-33.100189.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- '**/details_harness|winogrande|5_2024-01-05T04-32-33.100189.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T04-32-33.100189.parquet'
- config_name: results
data_files:
- split: 2024_01_05T04_32_33.100189
path:
- results_2024-01-05T04-32-33.100189.parquet
- split: latest
path:
- results_2024-01-05T04-32-33.100189.parquet
---
# Dataset Card for Evaluation run of BEE-spoke-data/zephyr-220m-dpo-full
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BEE-spoke-data/zephyr-220m-dpo-full](https://huggingface.co/BEE-spoke-data/zephyr-220m-dpo-full) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BEE-spoke-data__zephyr-220m-dpo-full",
"harness_winogrande_5",
	split="latest")
```
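The timestamped split names in the YAML header use underscores (e.g. `2024_01_05T04_32_33.100189`) and are zero-padded, so they sort lexicographically in chronological order. A minimal sketch (a hypothetical helper, not part of the harness) for picking the most recent timestamped split from a list of split names:

```python
# Hypothetical helper: choose the most recent timestamped split name.
# Names like "2024_01_05T04_32_33.100189" are zero-padded, so a plain
# lexicographic max() is also the chronological max().
def most_recent_split(split_names):
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped)

print(most_recent_split(["2024_01_05T04_32_33.100189", "latest"]))
# → 2024_01_05T04_32_33.100189
```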
## Latest results
These are the [latest results from run 2024-01-05T04:32:33.100189](https://huggingface.co/datasets/open-llm-leaderboard/details_BEE-spoke-data__zephyr-220m-dpo-full/blob/main/results_2024-01-05T04-32-33.100189.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2633144761549974,
"acc_stderr": 0.031001098499088355,
"acc_norm": 0.2646278371332489,
"acc_norm_stderr": 0.03179755881351347,
"mc1": 0.25091799265605874,
"mc1_stderr": 0.01517698502770769,
"mc2": 0.43441567768341954,
"mc2_stderr": 0.015533533425843614
},
"harness|arc:challenge|25": {
"acc": 0.2030716723549488,
"acc_stderr": 0.011755899303705582,
"acc_norm": 0.25426621160409557,
"acc_norm_stderr": 0.012724999945157738
},
"harness|hellaswag|10": {
"acc": 0.276638119896435,
"acc_stderr": 0.004464217420693376,
"acc_norm": 0.2914758016331408,
"acc_norm_stderr": 0.004535133886462045
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.035478541985608264,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.035478541985608264
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080343,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080343
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.03496101481191181,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.03496101481191181
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.028659179374292316,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.028659179374292316
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.15172413793103448,
"acc_stderr": 0.029896107594574617,
"acc_norm": 0.15172413793103448,
"acc_norm_stderr": 0.029896107594574617
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948368,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948368
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.0361960452412425,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.0361960452412425
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.31290322580645163,
"acc_stderr": 0.02637756702864586,
"acc_norm": 0.31290322580645163,
"acc_norm_stderr": 0.02637756702864586
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.033464098810559534,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.033464098810559534
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37305699481865284,
"acc_stderr": 0.03490205592048573,
"acc_norm": 0.37305699481865284,
"acc_norm_stderr": 0.03490205592048573
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3282051282051282,
"acc_stderr": 0.023807633198657266,
"acc_norm": 0.3282051282051282,
"acc_norm_stderr": 0.023807633198657266
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184407,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184407
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02755361446786379,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02755361446786379
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3229357798165138,
"acc_stderr": 0.020048115923415332,
"acc_norm": 0.3229357798165138,
"acc_norm_stderr": 0.020048115923415332
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29957805907172996,
"acc_stderr": 0.0298180247497531,
"acc_norm": 0.29957805907172996,
"acc_norm_stderr": 0.0298180247497531
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3004484304932735,
"acc_stderr": 0.030769352008229136,
"acc_norm": 0.3004484304932735,
"acc_norm_stderr": 0.030769352008229136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.03957835471980979,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.03957835471980979
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.03351953879521271,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.03351953879521271
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18376068376068377,
"acc_stderr": 0.025372139671722933,
"acc_norm": 0.18376068376068377,
"acc_norm_stderr": 0.025372139671722933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2669220945083014,
"acc_stderr": 0.015818450894777576,
"acc_norm": 0.2669220945083014,
"acc_norm_stderr": 0.015818450894777576
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.022698657167855716,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.022698657167855716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574877,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574877
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02564686309713791,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02564686309713791
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1832797427652733,
"acc_stderr": 0.021974198848265823,
"acc_norm": 0.1832797427652733,
"acc_norm_stderr": 0.021974198848265823
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23765432098765432,
"acc_stderr": 0.023683591837008553,
"acc_norm": 0.23765432098765432,
"acc_norm_stderr": 0.023683591837008553
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2872340425531915,
"acc_stderr": 0.026992199173064356,
"acc_norm": 0.2872340425531915,
"acc_norm_stderr": 0.026992199173064356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2503259452411995,
"acc_stderr": 0.01106415102716543,
"acc_norm": 0.2503259452411995,
"acc_norm_stderr": 0.01106415102716543
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329376,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329376
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2369281045751634,
"acc_stderr": 0.01720166216978978,
"acc_norm": 0.2369281045751634,
"acc_norm_stderr": 0.01720166216978978
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3836734693877551,
"acc_stderr": 0.031130880396235943,
"acc_norm": 0.3836734693877551,
"acc_norm_stderr": 0.031130880396235943
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-virology|5": {
"acc": 0.22289156626506024,
"acc_stderr": 0.03240004825594687,
"acc_norm": 0.22289156626506024,
"acc_norm_stderr": 0.03240004825594687
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23391812865497075,
"acc_stderr": 0.032467217651178264,
"acc_norm": 0.23391812865497075,
"acc_norm_stderr": 0.032467217651178264
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25091799265605874,
"mc1_stderr": 0.01517698502770769,
"mc2": 0.43441567768341954,
"mc2_stderr": 0.015533533425843614
},
"harness|winogrande|5": {
"acc": 0.5098658247829518,
"acc_stderr": 0.014049749833367592
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.002001305720948082
}
}
```
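As a rough sanity check on the numbers above, the `"all"` block aggregates the per-task scores. A minimal sketch of computing such a macro-average over a toy subset of the accuracies reported above (assumption: the aggregate is an unweighted mean over tasks):

```python
# Macro-average a toy subset of the per-task accuracies reported above
# (assumption: the aggregate is an unweighted mean over tasks).
scores = {
    "harness|hendrycksTest-abstract_algebra|5": 0.23,
    "harness|hendrycksTest-anatomy|5": 0.21481481481481482,
    "harness|hendrycksTest-astronomy|5": 0.21052631578947367,
}
macro_acc = sum(scores.values()) / len(scores)
print(round(macro_acc, 4))
# → 0.2184
```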
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
xuese99/hyp | ---
license: mit
task_categories:
- text-classification
- summarization
language:
- zh
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ShinDC/important_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 8618263476
num_examples: 16702061
- name: valid
num_bytes: 48072624
num_examples: 93164
download_size: 3804670316
dataset_size: 8666336100
---
# Dataset Card for "important_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cellos/test1 | ---
license: mit
---
|
erdometo/tquad-v1v2-reformat | ---
license: other
license_name: original
license_link: https://huggingface.co/datasets/husnu/tquad-v1v2
---
|
jerteh/SrpWiki | ---
license: cc-by-4.0
language:
- sr
pretty_name: Serbian WikiMedia dataset
size_categories:
- 10M<n<100M
configs:
- config_name: default
data_files:
- split: train
path:
- WikiKorpus.txt
- Sveznanje.txt
task_categories:
- text-generation
---
The dataset contains text from Wikipedia articles in Serbian (obtained in early 2020), totaling 477,473 articles (70 million words), as well as some WikiSource content.
The dataset is stored as plain text and can be loaded via:
```python
from datasets import load_dataset
dataset = load_dataset("jerteh/SrpWiki")
```
Preview:
```python
print(dataset["train"][1000])
{'text': 'ADUKCIJA (lat.: dovođenje), približavanje uda ili njegovog dela sr. liniji čovekovog tela; supr. → abdukcija.'}
``` |
amitness/logits-mt-128 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
- name: teacher_logits
sequence:
sequence: float64
- name: teacher_indices
sequence:
sequence: int64
- name: teacher_mask_indices
sequence: int64
splits:
- name: train
num_bytes: 196137984.23649114
num_examples: 43274
- name: test
num_bytes: 34614451.76350887
num_examples: 7637
download_size: 0
dataset_size: 230752436.0
---
# Dataset Card for "logits-mt-128"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-high_school_mathematics-rule-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 115352
num_examples: 270
download_size: 67562
dataset_size: 115352
---
# Dataset Card for "mmlu-high_school_mathematics-rule-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deadbits/vigil-jailbreak-all-MiniLM-L6-v2 | ---
tags:
- embeddings
- text
- security
pretty_name: 'Vigil: LLM Jailbreak all-MiniLM-L6-v2'
---
# Vigil: LLM Jailbreak all-MiniLM-L6-v2
- **Repo:** [github.com/deadbits/vigil-llm](https://github.com/deadbits/vigil-llm)
`Vigil` is a Python framework and REST API for assessing Large Language Model (LLM) prompts against a set of scanners to detect prompt injections, jailbreaks, and other potentially risky inputs.
This repository contains `all-MiniLM-L6-v2` embeddings for all "jailbreak" prompts used by [Vigil](https://github.com/deadbits/vigil-llm).
You can use the [parquet2vdb.py](https://github.com/deadbits/vigil-llm/blob/main/vigil/utils/parquet2vdb.py) utility to load the embeddings in the Vigil chromadb instance, or use them in your own application.
## Format
```json
[
{
"text": str,
"embedding": [],
"model": "all-MiniLM-L6-v2"
  }
]
```
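As an illustrative sketch (not part of the Vigil codebase), the detection idea behind these embeddings is nearest-neighbor similarity: embed an incoming prompt with `all-MiniLM-L6-v2` and flag it if it lies close to any known jailbreak embedding. The toy 3-dimensional vectors and the `is_jailbreak` helper below are stand-ins for the real 384-dimensional embeddings in this dataset:

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two 1-D vectors.
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_jailbreak(query_embedding, known_embeddings, threshold=0.8):
    # Flag the query if it is close to any known jailbreak embedding.
    return any(cosine_similarity(query_embedding, e) >= threshold
               for e in known_embeddings)

# Toy stand-ins for real 384-dim MiniLM embeddings.
known = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
print(is_jailbreak([0.99, 0.05, 0.0], known))  # prints True
```

In practice the `known` vectors would come from this dataset's `embedding` field and the threshold would be tuned against benign traffic.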
Jailbreak prompts sourced from: https://github.com/laiyer-ai/llm-guard/blob/399cb2eea70afc78482db226253ddd1d85f296e3/llm_guard/resources/jailbreak.json |
deepghs/anime_ch_hair_color | ---
license: mit
task_categories:
- image-classification
tags:
- art
size_categories:
- 10K<n<100K
--- |
razhan/riste | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 547191151
num_examples: 4608996
download_size: 316570906
dataset_size: 547191151
language:
- ku
license: odc-by
tags:
- kurdi
- ckb
- sorani
- kurdish
- central kurdish
pretty_name: Riste
size_categories:
- 1M<n<10M
---
# Riste - ڕستە
This dataset contains 4.6M unique sentences extracted from the NLLB dataset.
- Total sentences: 4,600,000
- Longest sentence length: 500 characters
- Shortest sentence length: 8 characters
- Total words in corpus: 46,467,500
- Total unique words: 1,880,401
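Statistics like the ones above can be reproduced with a short script. This is a hedged sketch over a toy sentence list rather than the real 4.6M-sentence corpus; the `corpus_stats` helper is illustrative, not part of the dataset:

```python
def corpus_stats(sentences):
    # Summary statistics of the kind reported above, for a list of sentences.
    lengths = [len(s) for s in sentences]
    words = [w for s in sentences for w in s.split()]
    return {
        "total_sentences": len(sentences),
        "longest_sentence": max(lengths),
        "shortest_sentence": min(lengths),
        "total_words": len(words),
        "unique_words": len(set(words)),
    }

# Toy Sorani sentences standing in for the full corpus.
print(corpus_stats(["ڕستە یەکەم", "ڕستەیەکی دیکە لێرەدایە"]))
```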
### Licensing Information
The original dataset is released under the terms of [ODC-BY](https://opendatacommons.org/licenses/by/1-0/). By using this dataset, you are also bound by the respective Terms of Use and License of the original source.
# Dataset Card for "riste"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vwxyzjn/summarize_from_feedback_oai_preprocessing_gpt2_48 | ---
dataset_info:
features:
- name: info
struct:
- name: id
dtype: string
- name: post
dtype: string
- name: title
dtype: string
- name: subreddit
dtype: string
- name: site
dtype: string
- name: article
dtype: string
- name: summaries
list:
- name: text
dtype: string
- name: policy
dtype: string
- name: note
dtype: string
- name: choice
dtype: int32
- name: worker
dtype: string
- name: batch
dtype: string
- name: split
dtype: string
- name: extra
struct:
- name: confidence
dtype: int32
- name: query_token
sequence: int64
- name: query
dtype: string
- name: response0
dtype: string
- name: response0_token
sequence: int64
- name: response0_token_len
dtype: int64
- name: response1
dtype: string
- name: response1_token
sequence: int64
- name: response1_token_len
dtype: int64
- name: response0_policy
dtype: string
- name: response1_policy
dtype: string
- name: policies
dtype: string
splits:
- name: train
num_bytes: 790831734
num_examples: 92858
- name: validation
num_bytes: 743452770
num_examples: 86086
download_size: 125252937
dataset_size: 1534284504
---
# Dataset Card for "summarize_from_feedback_oai_preprocessing_gpt2_48"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-30B-Superhot-8K-fp16 | ---
pretty_name: Evaluation run of TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-30B-Superhot-8K-fp16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-31T18:46:06.024423](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-30B-Superhot-8K-fp16/blob/main/results_2023-07-31T18%3A46%3A06.024423.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23519468841762173,\n\
\ \"acc_stderr\": 0.030867946729594396,\n \"acc_norm\": 0.23665032922383497,\n\
\ \"acc_norm_stderr\": 0.03088234450623421,\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.4747511496520905,\n\
\ \"mc2_stderr\": 0.016743067237896876\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22525597269624573,\n \"acc_stderr\": 0.012207839995407312,\n\
\ \"acc_norm\": 0.2619453924914676,\n \"acc_norm_stderr\": 0.012849054826858115\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2804222266480781,\n\
\ \"acc_stderr\": 0.004482874732237348,\n \"acc_norm\": 0.3296156144194384,\n\
\ \"acc_norm_stderr\": 0.004691128722535483\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1925925925925926,\n\
\ \"acc_stderr\": 0.03406542058502653,\n \"acc_norm\": 0.1925925925925926,\n\
\ \"acc_norm_stderr\": 0.03406542058502653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.19245283018867926,\n \"acc_stderr\": 0.024262979839372277,\n\
\ \"acc_norm\": 0.19245283018867926,\n \"acc_norm_stderr\": 0.024262979839372277\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400168,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400168\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.18064516129032257,\n \"acc_stderr\": 0.02188617856717255,\n \"\
acc_norm\": 0.18064516129032257,\n \"acc_norm_stderr\": 0.02188617856717255\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816524,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816524\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.18652849740932642,\n \"acc_stderr\": 0.028112091210117447,\n\
\ \"acc_norm\": 0.18652849740932642,\n \"acc_norm_stderr\": 0.028112091210117447\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722127995,\n\
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722127995\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.027722065493361255,\n\
\ \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.027722065493361255\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2361111111111111,\n \"acc_stderr\": 0.02896370257079103,\n \"\
acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.02896370257079103\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598035,\n\
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598035\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.27350427350427353,\n\
\ \"acc_stderr\": 0.029202540153431163,\n \"acc_norm\": 0.27350427350427353,\n\
\ \"acc_norm_stderr\": 0.029202540153431163\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n\
\ \"acc_stderr\": 0.01440029642922562,\n \"acc_norm\": 0.24581005586592178,\n\
\ \"acc_norm_stderr\": 0.01440029642922562\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1832797427652733,\n\
\ \"acc_stderr\": 0.021974198848265805,\n \"acc_norm\": 0.1832797427652733,\n\
\ \"acc_norm_stderr\": 0.021974198848265805\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.03895091015724136,\n\
\ \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.03895091015724136\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.22040816326530613,\n\
\ \"acc_stderr\": 0.02653704531214529,\n \"acc_norm\": 0.22040816326530613,\n\
\ \"acc_norm_stderr\": 0.02653704531214529\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.28654970760233917,\n\
\ \"acc_stderr\": 0.034678266857038266,\n \"acc_norm\": 0.28654970760233917,\n\
\ \"acc_norm_stderr\": 0.034678266857038266\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.22766217870257038,\n \"mc1_stderr\": 0.01467925503211107,\n\
\ \"mc2\": 0.4747511496520905,\n \"mc2_stderr\": 0.016743067237896876\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|arc:challenge|25_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hellaswag|10_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T18:46:06.024423.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-31T18:46:06.024423.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T18:46:06.024423.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-31T18:46:06.024423.parquet'
- config_name: results
data_files:
- split: 2023_07_31T18_46_06.024423
path:
- results_2023-07-31T18:46:06.024423.parquet
- split: latest
path:
- results_2023-07-31T18:46:06.024423.parquet
---
# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-30B-Superhot-8K-fp16",
"harness_truthfulqa_mc_0",
split="train")
```
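Every per-subject MMLU configuration in this card follows the same `harness_hendrycksTest_<subject>_5` naming pattern, so the config name for any subject can be derived programmatically. A minimal sketch (the `mmlu_config` helper below is hypothetical, not part of the `datasets` library):

```python
# Hypothetical helper: build the configuration name for one of the
# 5-shot MMLU (hendrycksTest) subjects listed in this card.
def mmlu_config(subject: str) -> str:
    return f"harness_hendrycksTest_{subject}_5"

# Other benchmarks use the same "<task>_<num_fewshot>" suffix convention,
# e.g. "harness_truthfulqa_mc_0" for 0-shot TruthfulQA.
print(mmlu_config("world_religions"))  # harness_hendrycksTest_world_religions_5
```

The resulting name can be passed as the second argument to `load_dataset`, as in the snippet above, optionally with `split="latest"` to always get the most recent run.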
## Latest results
These are the [latest results from run 2023-07-31T18:46:06.024423](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-30B-Superhot-8K-fp16/blob/main/results_2023-07-31T18%3A46%3A06.024423.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23519468841762173,
"acc_stderr": 0.030867946729594396,
"acc_norm": 0.23665032922383497,
"acc_norm_stderr": 0.03088234450623421,
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.4747511496520905,
"mc2_stderr": 0.016743067237896876
},
"harness|arc:challenge|25": {
"acc": 0.22525597269624573,
"acc_stderr": 0.012207839995407312,
"acc_norm": 0.2619453924914676,
"acc_norm_stderr": 0.012849054826858115
},
"harness|hellaswag|10": {
"acc": 0.2804222266480781,
"acc_stderr": 0.004482874732237348,
"acc_norm": 0.3296156144194384,
"acc_norm_stderr": 0.004691128722535483
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.1925925925925926,
"acc_stderr": 0.03406542058502653,
"acc_norm": 0.1925925925925926,
"acc_norm_stderr": 0.03406542058502653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.19245283018867926,
"acc_stderr": 0.024262979839372277,
"acc_norm": 0.19245283018867926,
"acc_norm_stderr": 0.024262979839372277
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400168,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400168
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18064516129032257,
"acc_stderr": 0.02188617856717255,
"acc_norm": 0.18064516129032257,
"acc_norm_stderr": 0.02188617856717255
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.18652849740932642,
"acc_stderr": 0.028112091210117447,
"acc_norm": 0.18652849740932642,
"acc_norm_stderr": 0.028112091210117447
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722127995,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722127995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23949579831932774,
"acc_stderr": 0.027722065493361255,
"acc_norm": 0.23949579831932774,
"acc_norm_stderr": 0.027722065493361255
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.02896370257079103,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.02896370257079103
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.27350427350427353,
"acc_stderr": 0.029202540153431163,
"acc_norm": 0.27350427350427353,
"acc_norm_stderr": 0.029202540153431163
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.01440029642922562,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.01440029642922562
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1832797427652733,
"acc_stderr": 0.021974198848265805,
"acc_norm": 0.1832797427652733,
"acc_norm_stderr": 0.021974198848265805
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724136,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724136
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22040816326530613,
"acc_stderr": 0.02653704531214529,
"acc_norm": 0.22040816326530613,
"acc_norm_stderr": 0.02653704531214529
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.4747511496520905,
"mc2_stderr": 0.016743067237896876
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-instruct | ---
pretty_name: Evaluation run of bofenghuang/vigogne-2-7b-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bofenghuang/vigogne-2-7b-instruct](https://huggingface.co/bofenghuang/vigogne-2-7b-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T08:45:31.930950](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-instruct/blob/main/results_2023-09-23T08-45-31.930950.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2985528523489933,\n\
\ \"em_stderr\": 0.0046864904941642995,\n \"f1\": 0.3518403942953031,\n\
\ \"f1_stderr\": 0.004613402461586294,\n \"acc\": 0.39622289254314186,\n\
\ \"acc_stderr\": 0.008677803422491042\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2985528523489933,\n \"em_stderr\": 0.0046864904941642995,\n\
\ \"f1\": 0.3518403942953031,\n \"f1_stderr\": 0.004613402461586294\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03790750568612585,\n \
\ \"acc_stderr\": 0.005260333907798437\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7545382794001578,\n \"acc_stderr\": 0.012095272937183646\n\
\ }\n}\n```"
repo_url: https://huggingface.co/bofenghuang/vigogne-2-7b-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|arc:challenge|25_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T08_45_31.930950
path:
- '**/details_harness|drop|3_2023-09-23T08-45-31.930950.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T08-45-31.930950.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T08_45_31.930950
path:
- '**/details_harness|gsm8k|5_2023-09-23T08-45-31.930950.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T08-45-31.930950.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hellaswag|10_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T10:36:05.447803.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-25T10:36:05.447803.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-25T10:36:05.447803.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T08_45_31.930950
path:
- '**/details_harness|winogrande|5_2023-09-23T08-45-31.930950.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T08-45-31.930950.parquet'
- config_name: results
data_files:
- split: 2023_07_25T10_36_05.447803
path:
- results_2023-07-25T10:36:05.447803.parquet
- split: 2023_09_23T08_45_31.930950
path:
- results_2023-09-23T08-45-31.930950.parquet
- split: latest
path:
- results_2023-09-23T08-45-31.930950.parquet
---
# Dataset Card for Evaluation run of bofenghuang/vigogne-2-7b-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bofenghuang/vigogne-2-7b-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bofenghuang/vigogne-2-7b-instruct](https://huggingface.co/bofenghuang/vigogne-2-7b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-instruct",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-23T08:45:31.930950](https://huggingface.co/datasets/open-llm-leaderboard/details_bofenghuang__vigogne-2-7b-instruct/blob/main/results_2023-09-23T08-45-31.930950.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2985528523489933,
"em_stderr": 0.0046864904941642995,
"f1": 0.3518403942953031,
"f1_stderr": 0.004613402461586294,
"acc": 0.39622289254314186,
"acc_stderr": 0.008677803422491042
},
"harness|drop|3": {
"em": 0.2985528523489933,
"em_stderr": 0.0046864904941642995,
"f1": 0.3518403942953031,
"f1_stderr": 0.004613402461586294
},
"harness|gsm8k|5": {
"acc": 0.03790750568612585,
"acc_stderr": 0.005260333907798437
},
"harness|winogrande|5": {
"acc": 0.7545382794001578,
"acc_stderr": 0.012095272937183646
}
}
```
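Since these results are plain JSON, individual metrics can be read with ordinary dict access. The sketch below inlines a small fragment of the values shown above purely for illustration:

```python
# A fragment of the "Latest results" shown above, inlined for illustration.
results = {
    "all": {"acc": 0.39622289254314186},
    "harness|winogrande|5": {"acc": 0.7545382794001578},
}

def metric(results: dict, task: str, name: str) -> float:
    """Read one metric value for a task from the results mapping."""
    return results[task][name]
```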
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BangumiBase/attackontitan | ---
license: mit
tags:
- art
size_categories:
- 10K<n<100K
---
# Bangumi Image Base of Attack On Titan
This is the image base of the bangumi Attack On Titan. We detected 76 characters and 14308 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
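If you do review samples manually after extracting one of the per-character zips, a small helper can collect the image files for inspection. This is only a sketch: the extension set is an assumption, not a guarantee about the archives.

```python
import os

# Common raster extensions; the archives may use others as well (assumption).
IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".webp"}

def list_images(dataset_dir):
    """List image files under an extracted archive, for manual noise review."""
    return sorted(
        os.path.join(root, name)
        for root, _dirs, names in os.walk(dataset_dir)
        for name in names
        if os.path.splitext(name)[1].lower() in IMAGE_EXTS
    )
```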
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 1568 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 705 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 1342 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 1771 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 304 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 735 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 173 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 72 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 50 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 164 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 87 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 32 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 122 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 462 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 141 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 183 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 60 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 52 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 49 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 1082 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 57 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 587 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 224 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 140 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 110 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 26 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 581 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 86 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 60 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 141 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 59 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 534 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 64 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 173 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 22 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 32 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 133 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 230 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 94 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 44 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 46 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 48 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 102 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 36 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 36 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 51 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 51 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 52 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 55 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 51 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 23 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 70 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 66 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 77 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 37 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 20 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 33 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 94 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 25 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 19 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 32 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 31 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 44 | [Download](62/dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 23 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 59 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 52 | [Download](65/dataset.zip) |  |  |  |  |  |  |  |  |
| 66 | 39 | [Download](66/dataset.zip) |  |  |  |  |  |  |  |  |
| 67 | 58 | [Download](67/dataset.zip) |  |  |  |  |  |  |  |  |
| 68 | 16 | [Download](68/dataset.zip) |  |  |  |  |  |  |  |  |
| 69 | 18 | [Download](69/dataset.zip) |  |  |  |  |  |  |  |  |
| 70 | 26 | [Download](70/dataset.zip) |  |  |  |  |  |  |  |  |
| 71 | 31 | [Download](71/dataset.zip) |  |  |  |  |  |  |  |  |
| 72 | 68 | [Download](72/dataset.zip) |  |  |  |  |  |  |  |  |
| 73 | 7 | [Download](73/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 74 | 10 | [Download](74/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 251 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
Amir13/ncbi-persian | ---
annotations_creators:
- expert-generated
language:
- fa
language_creators:
- machine-generated
license:
- other
multilinguality:
- monolingual
pretty_name: ncbi-persian
size_categories:
- 1K<n<10K
source_datasets:
- extended|ncbi_disease
tags:
- named entity recognition
task_categories:
- token-classification
task_ids:
- named-entity-recognition
train-eval-index:
- col_mapping:
ner_tags: target
tokens: text
config: ncbi_disease
metrics:
- name: Accuracy
type: accuracy
- args:
average: macro
name: F1 macro
type: f1
- args:
average: micro
name: F1 micro
type: f1
- args:
average: weighted
name: F1 weighted
type: f1
- args:
average: macro
name: Precision macro
type: precision
- args:
average: micro
name: Precision micro
type: precision
- args:
average: weighted
name: Precision weighted
type: precision
- args:
average: macro
name: Recall macro
type: recall
- args:
average: micro
name: Recall micro
type: recall
- args:
average: weighted
name: Recall weighted
type: recall
splits:
eval_split: test
train_split: train
task: token-classification
---
# Dataset Card for ncbi-persian
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset is a Persian named-entity-recognition corpus derived from the English [NCBI Disease](https://huggingface.co/datasets/ncbi_disease) dataset via machine translation (see the citation below for details).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
If you use the datasets and models in this repository, please cite the following:
```bibtex
@misc{https://doi.org/10.48550/arxiv.2302.09611,
doi = {10.48550/ARXIV.2302.09611},
url = {https://arxiv.org/abs/2302.09611},
author = {Sartipi, Amir and Fatemi, Afsaneh},
keywords = {Computation and Language (cs.CL), Artificial Intelligence (cs.AI), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Exploring the Potential of Machine Translation for Generating Named Entity Datasets: A Case Study between Persian and English},
publisher = {arXiv},
year = {2023},
copyright = {arXiv.org perpetual, non-exclusive license}
}
```
### Contributions
[More Information Needed]
|
CyberHarem/ark_royal_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of ark_royal/アーク・ロイヤル/皇家方舟 (Azur Lane)
This is the dataset of ark_royal/アーク・ロイヤル/皇家方舟 (Azur Lane), containing 101 images and their tags.
The core tags of this character are `breasts, blue_eyes, black_hair, short_hair, large_breasts, hair_over_one_eye, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 101 | 128.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ark_royal_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 101 | 73.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ark_royal_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 225 | 143.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ark_royal_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 101 | 114.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ark_royal_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 225 | 205.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ark_royal_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ark_royal_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 25 |  |  |  |  |  | 1girl, cleavage, solo, looking_at_viewer, skirt, garter_straps, black_thighhighs, white_gloves, smile, gun, holding_weapon, standing, white_background |
| 1 | 5 |  |  |  |  |  | 1girl, solo, upper_body, cleavage_cutout, jacket, looking_at_viewer, simple_background, smile, open_mouth, dated, long_sleeves, shirt, white_background |
| 2 | 13 |  |  |  |  |  | 1girl, black_dress, cleavage, solo, halter_dress, looking_at_viewer, black_gloves, half_gloves, sleeveless_dress, bracelet, closed_mouth, jacket_on_shoulders, smile, bare_shoulders, official_alternate_costume, sitting, thighs, coat, collarbone, earrings, bare_legs, black_footwear, full_body, high_heels, holding, white_background |
| 3 | 14 |  |  |  |  |  | 1girl, blue_bikini, cleavage, smile, solo, looking_at_viewer, navel, choker, sunglasses, collarbone, bare_shoulders, day, earrings, ocean, outdoors, sitting, blue_sky, umbrella, beach, bracelet, cloud, cowboy_shot, holding, o-ring_bikini, stomach, thigh_strap |
| 4 | 7 |  |  |  |  |  | 1girl, black_pants, smile, white_shirt, black_choker, cleavage, collarbone, collared_shirt, long_sleeves, looking_at_viewer, solo, belt, black_bra, black_necktie, closed_mouth, black_corset, black_footwear, boots, parted_lips, sitting, underbust, weapon |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | solo | looking_at_viewer | skirt | garter_straps | black_thighhighs | white_gloves | smile | gun | holding_weapon | standing | white_background | upper_body | cleavage_cutout | jacket | simple_background | open_mouth | dated | long_sleeves | shirt | black_dress | halter_dress | black_gloves | half_gloves | sleeveless_dress | bracelet | closed_mouth | jacket_on_shoulders | bare_shoulders | official_alternate_costume | sitting | thighs | coat | collarbone | earrings | bare_legs | black_footwear | full_body | high_heels | holding | blue_bikini | navel | choker | sunglasses | day | ocean | outdoors | blue_sky | umbrella | beach | cloud | cowboy_shot | o-ring_bikini | stomach | thigh_strap | black_pants | white_shirt | black_choker | collared_shirt | belt | black_bra | black_necktie | black_corset | boots | parted_lips | underbust | weapon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-------|:--------------------|:--------|:----------------|:-------------------|:---------------|:--------|:------|:-----------------|:-----------|:-------------------|:-------------|:------------------|:---------|:--------------------|:-------------|:--------|:---------------|:--------|:--------------|:---------------|:---------------|:--------------|:-------------------|:-----------|:---------------|:----------------------|:-----------------|:-----------------------------|:----------|:---------|:-------|:-------------|:-----------|:------------|:-----------------|:------------|:-------------|:----------|:--------------|:--------|:---------|:-------------|:------|:--------|:-----------|:-----------|:-----------|:--------|:--------|:--------------|:----------------|:----------|:--------------|:--------------|:--------------|:---------------|:-----------------|:-------|:------------|:----------------|:---------------|:--------|:--------------|:------------|:---------|
| 0 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | X | | | | | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 13 |  |  |  |  |  | X | X | X | X | | | | | X | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | X | X | X | X | | | | | X | | | | | | | | | | | | | | | | | | X | | | X | | X | | | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | X | X | | | | | X | | | | | | | | | | | X | | | | | | | | X | | | | X | | | X | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
khondoker/SentNoB | ---
language:
- bn
task_categories:
- text-classification
pretty_name: SentNoB
task_ids:
- sentiment-classification
annotations_creators:
- expert-generated
language_creators:
- expert-generated
paperswithcode_id: sentnob
---
# Dataset Card for "SentNoB"
### Dataset Summary
A sentiment analysis dataset of social media user comments. Each user comment is labeled as positive (1), negative (2), or neutral (0).
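For convenience, the integer labels above can be mapped to their names with a small helper; the mapping below simply restates the summary and is not an official artifact of the dataset:

```python
# Label ids as stated in the dataset summary above.
LABEL_NAMES = {0: "neutral", 1: "positive", 2: "negative"}

def label_name(label_id: int) -> str:
    """Map a SentNoB integer label to its sentiment name."""
    return LABEL_NAMES[label_id]
```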
### Citation Information
```
@inproceedings{islam2021sentnob,
title={SentNoB: A Dataset for Analysing Sentiment on Noisy Bangla Texts},
author={Islam, Khondoker Ittehadul and Kar, Sudipta and Islam, Md Saiful and Amin, Mohammad Ruhul},
booktitle={Findings of the Association for Computational Linguistics: EMNLP 2021},
pages={3265--3271},
year={2021}
}
``` |
RealTimeData/News_Seq_2021 | ---
dataset_info:
features:
- name: authors
sequence: string
- name: date_download
dtype: string
- name: date_modify
dtype: string
- name: date_publish
dtype: string
- name: description
dtype: string
- name: filename
dtype: string
- name: image_url
dtype: string
- name: language
dtype: string
- name: localpath
dtype: string
- name: maintext
dtype: string
- name: source_domain
dtype: string
- name: title
dtype: string
- name: title_page
dtype: string
- name: title_rss
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 16944979
num_examples: 4252
download_size: 8112201
dataset_size: 16944979
---
# Dataset Card for "News_Seq_2021"
This dataset was constructed on 1 Sep 2021 and contains news published from 10 June 2021 to 21 Aug 2021 by various sources.
All news articles in this dataset are in English.
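Given the `date_publish` field listed in the schema above, one can sanity-check that an article falls inside the stated window. The sketch below assumes `date_publish` strings begin with an ISO `YYYY-MM-DD` date, which may not hold for every row:

```python
def in_window(date_publish, start="2021-06-10", end="2021-08-21"):
    """Return True if an article's publish date falls in the card's stated range.

    Rows with a missing or differently formatted date are treated as out of
    window (assumption: dates begin with an ISO YYYY-MM-DD prefix).
    """
    if not date_publish or len(date_publish) < 10:
        return False
    return start <= date_publish[:10] <= end
```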
Created from `commoncrawl`. |
cissedelocht/timingdiagrams | ---
license: mit
---
|
CyberHarem/flamme_sousounofrieren | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Flamme/フランメ (Sousou no Frieren)
This is the dataset of Flamme/フランメ (Sousou no Frieren), containing 74 images and their tags.
The core tags of this character are `long_hair, green_eyes, brown_hair, orange_hair, earrings, braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 74 | 57.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/flamme_sousounofrieren/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 74 | 57.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/flamme_sousounofrieren/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 136 | 99.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/flamme_sousounofrieren/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/flamme_sousounofrieren',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, collarbone, solo, upper_body, closed_mouth, smile, looking_at_viewer, outdoors, white_shirt, forest, jewelry |
| 1 | 11 |  |  |  |  |  | 1girl, collarbone, solo, armlet, white_dress, hair_over_one_eye, upper_body, closed_mouth, short_sleeves, bracelet, smile, breasts, choker, closed_eyes, red_hair |
| 2 | 6 |  |  |  |  |  | 1girl, blurry_background, jewelry, profile, solo, choker, from_side, portrait, open_mouth, upper_teeth_only |
| 3 | 7 |  |  |  |  |  | 1girl, blurry_background, smile, open_mouth, solo_focus, white_shirt, 1boy, out_of_frame, sidelocks, teeth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | collarbone | solo | upper_body | closed_mouth | smile | looking_at_viewer | outdoors | white_shirt | forest | jewelry | armlet | white_dress | hair_over_one_eye | short_sleeves | bracelet | breasts | choker | closed_eyes | red_hair | blurry_background | profile | from_side | portrait | open_mouth | upper_teeth_only | solo_focus | 1boy | out_of_frame | sidelocks | teeth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:-------|:-------------|:---------------|:--------|:--------------------|:-----------|:--------------|:---------|:----------|:---------|:--------------|:--------------------|:----------------|:-----------|:----------|:---------|:--------------|:-----------|:--------------------|:----------|:------------|:-----------|:-------------|:-------------------|:-------------|:-------|:---------------|:------------|:--------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | X | X | X | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | X | | | | | | | | X | | | | | | | X | | | X | X | X | X | X | X | | | | | |
| 3 | 7 |  |  |  |  |  | X | | | | | X | | | X | | | | | | | | | | | | X | | | | X | | X | X | X | X | X |
|
open-llm-leaderboard/details_ewqr2130__llama2-7b-raw-sft | ---
pretty_name: Evaluation run of ewqr2130/llama2-7b-raw-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ewqr2130/llama2-7b-raw-sft](https://huggingface.co/ewqr2130/llama2-7b-raw-sft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ewqr2130__llama2-7b-raw-sft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-10T15:15:16.030532](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__llama2-7b-raw-sft/blob/main/results_2024-01-10T15-15-16.030532.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3451686041304389,\n\
\ \"acc_stderr\": 0.033177024770114395,\n \"acc_norm\": 0.34794617103590064,\n\
\ \"acc_norm_stderr\": 0.033992606612009306,\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.015201522246299963,\n \"mc2\": 0.4077071941467522,\n\
\ \"mc2_stderr\": 0.014214727907656348\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.43430034129692835,\n \"acc_stderr\": 0.01448470304885736,\n\
\ \"acc_norm\": 0.47440273037542663,\n \"acc_norm_stderr\": 0.014592230885298964\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5518820952001593,\n\
\ \"acc_stderr\": 0.004962846206125493,\n \"acc_norm\": 0.7525393347938658,\n\
\ \"acc_norm_stderr\": 0.004306547156331412\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.37037037037037035,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.37037037037037035,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.28289473684210525,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.28289473684210525,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.41509433962264153,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.41509433962264153,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3402777777777778,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.3402777777777778,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n\
\ \"acc_stderr\": 0.035839017547364106,\n \"acc_norm\": 0.32947976878612717,\n\
\ \"acc_norm_stderr\": 0.035839017547364106\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.03395490020856112,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.03395490020856112\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4258064516129032,\n \"acc_stderr\": 0.0281291127091659,\n \"acc_norm\"\
: 0.4258064516129032,\n \"acc_norm_stderr\": 0.0281291127091659\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3103448275862069,\n\
\ \"acc_stderr\": 0.032550867699701024,\n \"acc_norm\": 0.3103448275862069,\n\
\ \"acc_norm_stderr\": 0.032550867699701024\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \
\ \"acc\": 0.44242424242424244,\n \"acc_stderr\": 0.03878372113711275,\n\
\ \"acc_norm\": 0.44242424242424244,\n \"acc_norm_stderr\": 0.03878372113711275\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"\
acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.49222797927461137,\n \"acc_stderr\": 0.036080032255696545,\n\
\ \"acc_norm\": 0.49222797927461137,\n \"acc_norm_stderr\": 0.036080032255696545\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3384615384615385,\n \"acc_stderr\": 0.02399150050031304,\n \
\ \"acc_norm\": 0.3384615384615385,\n \"acc_norm_stderr\": 0.02399150050031304\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36134453781512604,\n \"acc_stderr\": 0.031204691225150013,\n\
\ \"acc_norm\": 0.36134453781512604,\n \"acc_norm_stderr\": 0.031204691225150013\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3724770642201835,\n \"acc_stderr\": 0.020728368457638494,\n \"\
acc_norm\": 0.3724770642201835,\n \"acc_norm_stderr\": 0.020728368457638494\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4019607843137255,\n \"acc_stderr\": 0.034411900234824655,\n \"\
acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.034411900234824655\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3755274261603376,\n \"acc_stderr\": 0.03152256243091156,\n \
\ \"acc_norm\": 0.3755274261603376,\n \"acc_norm_stderr\": 0.03152256243091156\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.33183856502242154,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.33183856502242154,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.35877862595419846,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.35877862595419846,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.39669421487603307,\n \"acc_stderr\": 0.04465869780531009,\n \"\
acc_norm\": 0.39669421487603307,\n \"acc_norm_stderr\": 0.04465869780531009\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3611111111111111,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.3611111111111111,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3312883435582822,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.3312883435582822,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.17857142857142858,\n\
\ \"acc_stderr\": 0.036352091215778065,\n \"acc_norm\": 0.17857142857142858,\n\
\ \"acc_norm_stderr\": 0.036352091215778065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.36893203883495146,\n \"acc_stderr\": 0.04777615181156739,\n\
\ \"acc_norm\": 0.36893203883495146,\n \"acc_norm_stderr\": 0.04777615181156739\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5085470085470085,\n\
\ \"acc_stderr\": 0.0327513030009703,\n \"acc_norm\": 0.5085470085470085,\n\
\ \"acc_norm_stderr\": 0.0327513030009703\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4240102171136654,\n\
\ \"acc_stderr\": 0.017672263329084226,\n \"acc_norm\": 0.4240102171136654,\n\
\ \"acc_norm_stderr\": 0.017672263329084226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.028180596328259293,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.028180596328259293\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.34726688102893893,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.34726688102893893,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.31790123456790126,\n \"acc_stderr\": 0.02591006352824088,\n\
\ \"acc_norm\": 0.31790123456790126,\n \"acc_norm_stderr\": 0.02591006352824088\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.28368794326241137,\n \"acc_stderr\": 0.02689170942834396,\n \
\ \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.02689170942834396\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2940026075619296,\n\
\ \"acc_stderr\": 0.011636062953698604,\n \"acc_norm\": 0.2940026075619296,\n\
\ \"acc_norm_stderr\": 0.011636062953698604\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4632352941176471,\n \"acc_stderr\": 0.030290619180485687,\n\
\ \"acc_norm\": 0.4632352941176471,\n \"acc_norm_stderr\": 0.030290619180485687\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.28431372549019607,\n \"acc_stderr\": 0.018249024411207668,\n \
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.018249024411207668\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.42727272727272725,\n\
\ \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.42727272727272725,\n\
\ \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3877551020408163,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.3877551020408163,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.43283582089552236,\n\
\ \"acc_stderr\": 0.03503490923673281,\n \"acc_norm\": 0.43283582089552236,\n\
\ \"acc_norm_stderr\": 0.03503490923673281\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n\
\ \"acc_stderr\": 0.0374005938202932,\n \"acc_norm\": 0.3614457831325301,\n\
\ \"acc_norm_stderr\": 0.0374005938202932\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3742690058479532,\n \"acc_stderr\": 0.03711601185389481,\n\
\ \"acc_norm\": 0.3742690058479532,\n \"acc_norm_stderr\": 0.03711601185389481\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n\
\ \"mc1_stderr\": 0.015201522246299963,\n \"mc2\": 0.4077071941467522,\n\
\ \"mc2_stderr\": 0.014214727907656348\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7300710339384373,\n \"acc_stderr\": 0.012476433372002608\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.037149355572403335,\n \
\ \"acc_stderr\": 0.005209516283073736\n }\n}\n```"
repo_url: https://huggingface.co/ewqr2130/llama2-7b-raw-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|arc:challenge|25_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|gsm8k|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hellaswag|10_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T15-15-16.030532.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T15-15-16.030532.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- '**/details_harness|winogrande|5_2024-01-10T15-15-16.030532.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-10T15-15-16.030532.parquet'
- config_name: results
data_files:
- split: 2024_01_10T15_15_16.030532
path:
- results_2024-01-10T15-15-16.030532.parquet
- split: latest
path:
- results_2024-01-10T15-15-16.030532.parquet
---
# Dataset Card for Evaluation run of ewqr2130/llama2-7b-raw-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ewqr2130/llama2-7b-raw-sft](https://huggingface.co/ewqr2130/llama2-7b-raw-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ewqr2130__llama2-7b-raw-sft",
"harness_winogrande_5",
split="train")
```
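The per-task config names in this repo follow a fixed pattern derived from the harness task id (for example, `harness|hendrycksTest-virology|5` becomes config `harness_hendrycksTest_virology_5`). A minimal sketch of that mapping — the helper name is hypothetical, not part of the `datasets` API:

```python
def task_to_config(task: str) -> str:
    """Map a harness task id such as 'harness|hendrycksTest-virology|5'
    to its config name in this repo, 'harness_hendrycksTest_virology_5'.
    (Hypothetical helper, shown only to document the naming convention.)"""
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

print(task_to_config("harness|truthfulqa:mc|0"))  # harness_truthfulqa_mc_0
```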
## Latest results
These are the [latest results from run 2024-01-10T15:15:16.030532](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__llama2-7b-raw-sft/blob/main/results_2024-01-10T15-15-16.030532.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3451686041304389,
"acc_stderr": 0.033177024770114395,
"acc_norm": 0.34794617103590064,
"acc_norm_stderr": 0.033992606612009306,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299963,
"mc2": 0.4077071941467522,
"mc2_stderr": 0.014214727907656348
},
"harness|arc:challenge|25": {
"acc": 0.43430034129692835,
"acc_stderr": 0.01448470304885736,
"acc_norm": 0.47440273037542663,
"acc_norm_stderr": 0.014592230885298964
},
"harness|hellaswag|10": {
"acc": 0.5518820952001593,
"acc_stderr": 0.004962846206125493,
"acc_norm": 0.7525393347938658,
"acc_norm_stderr": 0.004306547156331412
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.28289473684210525,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.28289473684210525,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.41509433962264153,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.41509433962264153,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3402777777777778,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.3402777777777778,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.035839017547364106,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.035839017547364106
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.03395490020856112,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.03395490020856112
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4258064516129032,
"acc_stderr": 0.0281291127091659,
"acc_norm": 0.4258064516129032,
"acc_norm_stderr": 0.0281291127091659
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.032550867699701024,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.032550867699701024
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.44242424242424244,
"acc_stderr": 0.03878372113711275,
"acc_norm": 0.44242424242424244,
"acc_norm_stderr": 0.03878372113711275
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.49222797927461137,
"acc_stderr": 0.036080032255696545,
"acc_norm": 0.49222797927461137,
"acc_norm_stderr": 0.036080032255696545
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3384615384615385,
"acc_stderr": 0.02399150050031304,
"acc_norm": 0.3384615384615385,
"acc_norm_stderr": 0.02399150050031304
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36134453781512604,
"acc_stderr": 0.031204691225150013,
"acc_norm": 0.36134453781512604,
"acc_norm_stderr": 0.031204691225150013
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3724770642201835,
"acc_stderr": 0.020728368457638494,
"acc_norm": 0.3724770642201835,
"acc_norm_stderr": 0.020728368457638494
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.034411900234824655,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.034411900234824655
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3755274261603376,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.3755274261603376,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.33183856502242154,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.33183856502242154,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.35877862595419846,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.35877862595419846,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.39669421487603307,
"acc_stderr": 0.04465869780531009,
"acc_norm": 0.39669421487603307,
"acc_norm_stderr": 0.04465869780531009
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3312883435582822,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.3312883435582822,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.17857142857142858,
"acc_stderr": 0.036352091215778065,
"acc_norm": 0.17857142857142858,
"acc_norm_stderr": 0.036352091215778065
},
"harness|hendrycksTest-management|5": {
"acc": 0.36893203883495146,
"acc_stderr": 0.04777615181156739,
"acc_norm": 0.36893203883495146,
"acc_norm_stderr": 0.04777615181156739
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5085470085470085,
"acc_stderr": 0.0327513030009703,
"acc_norm": 0.5085470085470085,
"acc_norm_stderr": 0.0327513030009703
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4240102171136654,
"acc_stderr": 0.017672263329084226,
"acc_norm": 0.4240102171136654,
"acc_norm_stderr": 0.017672263329084226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.028180596328259293,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.028180596328259293
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.34726688102893893,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.34726688102893893,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.31790123456790126,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.31790123456790126,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.02689170942834396,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.02689170942834396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2940026075619296,
"acc_stderr": 0.011636062953698604,
"acc_norm": 0.2940026075619296,
"acc_norm_stderr": 0.011636062953698604
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4632352941176471,
"acc_stderr": 0.030290619180485687,
"acc_norm": 0.4632352941176471,
"acc_norm_stderr": 0.030290619180485687
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.018249024411207668,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.018249024411207668
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.42727272727272725,
"acc_stderr": 0.04738198703545483,
"acc_norm": 0.42727272727272725,
"acc_norm_stderr": 0.04738198703545483
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3877551020408163,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.3877551020408163,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.43283582089552236,
"acc_stderr": 0.03503490923673281,
"acc_norm": 0.43283582089552236,
"acc_norm_stderr": 0.03503490923673281
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3614457831325301,
"acc_stderr": 0.0374005938202932,
"acc_norm": 0.3614457831325301,
"acc_norm_stderr": 0.0374005938202932
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3742690058479532,
"acc_stderr": 0.03711601185389481,
"acc_norm": 0.3742690058479532,
"acc_norm_stderr": 0.03711601185389481
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299963,
"mc2": 0.4077071941467522,
"mc2_stderr": 0.014214727907656348
},
"harness|winogrande|5": {
"acc": 0.7300710339384373,
"acc_stderr": 0.012476433372002608
},
"harness|gsm8k|5": {
"acc": 0.037149355572403335,
"acc_stderr": 0.005209516283073736
}
}
```
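Since the per-subtask entries above share the same key layout, an MMLU-style mean accuracy over the 5-shot `hendrycksTest` subtasks can be computed directly from this JSON. A small sketch using three of the values above (subset chosen for illustration, not an official aggregation):

```python
# A few of the per-subtask results copied from the JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.26},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.37037037037037035},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.28289473684210525},
}

# Mean accuracy over the hendrycksTest (MMLU) subtasks present in the dict.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
mean_acc = sum(mmlu_accs) / len(mmlu_accs)
print(round(mean_acc, 4))  # 0.3044
```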
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
joey234/mmlu-human_sexuality-neg-prepend-fix | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
splits:
- name: dev
num_bytes: 5197
num_examples: 5
- name: test
num_bytes: 288820
num_examples: 131
download_size: 13461
dataset_size: 294017
---
# Dataset Card for "mmlu-human_sexuality-neg-prepend-fix"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jbrophy123/quora_dataset | ---
dataset_info:
features:
- name: chat_sample
dtype: string
- name: dataset_origin
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 6540045
num_examples: 5000
download_size: 0
dataset_size: 6540045
---
# Dataset Card for "quora_dataset"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_walebadr__Mistral-7B-v0.1-DPO | ---
pretty_name: Evaluation run of walebadr/Mistral-7B-v0.1-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [walebadr/Mistral-7B-v0.1-DPO](https://huggingface.co/walebadr/Mistral-7B-v0.1-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_walebadr__Mistral-7B-v0.1-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T18:02:23.868441](https://huggingface.co/datasets/open-llm-leaderboard/details_walebadr__Mistral-7B-v0.1-DPO/blob/main/results_2024-01-13T18-02-23.868441.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2513839168298002,\n\
\ \"acc_stderr\": 0.03077453939218842,\n \"acc_norm\": 0.2517964377923722,\n\
\ \"acc_norm_stderr\": 0.03159254911508562,\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.4935990954197777,\n\
\ \"mc2_stderr\": 0.017220011527240037\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23464163822525597,\n \"acc_stderr\": 0.012383873560768673,\n\
\ \"acc_norm\": 0.2781569965870307,\n \"acc_norm_stderr\": 0.0130944699195388\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2562238597888867,\n\
\ \"acc_stderr\": 0.004356547185847042,\n \"acc_norm\": 0.2622983469428401,\n\
\ \"acc_norm_stderr\": 0.004389849907040314\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.0359144408419697,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.0359144408419697\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.28289473684210525,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.28289473684210525,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816503,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816503\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22641509433962265,\n \"acc_stderr\": 0.025757559893106765,\n\
\ \"acc_norm\": 0.22641509433962265,\n \"acc_norm_stderr\": 0.025757559893106765\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.20833333333333334,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24855491329479767,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.24855491329479767,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33617021276595743,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.33617021276595743,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022057,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022057\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948368,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948368\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.25161290322580643,\n \"acc_stderr\": 0.024685979286239956,\n \"\
acc_norm\": 0.25161290322580643,\n \"acc_norm_stderr\": 0.024685979286239956\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2019704433497537,\n \"acc_stderr\": 0.028247350122180253,\n \"\
acc_norm\": 0.2019704433497537,\n \"acc_norm_stderr\": 0.028247350122180253\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.20606060606060606,\n \"acc_stderr\": 0.031584153240477086,\n\
\ \"acc_norm\": 0.20606060606060606,\n \"acc_norm_stderr\": 0.031584153240477086\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270286,\n \"\
acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270286\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2694300518134715,\n \"acc_stderr\": 0.03201867122877794,\n\
\ \"acc_norm\": 0.2694300518134715,\n \"acc_norm_stderr\": 0.03201867122877794\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2641025641025641,\n \"acc_stderr\": 0.02235219373745327,\n \
\ \"acc_norm\": 0.2641025641025641,\n \"acc_norm_stderr\": 0.02235219373745327\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.02592887613276611,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.02592887613276611\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.030388353551886845,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.030388353551886845\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.20550458715596331,\n \"acc_stderr\": 0.017324352325016012,\n \"\
acc_norm\": 0.20550458715596331,\n \"acc_norm_stderr\": 0.017324352325016012\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.32407407407407407,\n \"acc_stderr\": 0.03191923445686185,\n \"\
acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.03191923445686185\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22549019607843138,\n \"acc_stderr\": 0.029331162294251742,\n \"\
acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.029331162294251742\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2869198312236287,\n \"acc_stderr\": 0.029443773022594693,\n \
\ \"acc_norm\": 0.2869198312236287,\n \"acc_norm_stderr\": 0.029443773022594693\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.25112107623318386,\n\
\ \"acc_stderr\": 0.02910522083322462,\n \"acc_norm\": 0.25112107623318386,\n\
\ \"acc_norm_stderr\": 0.02910522083322462\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2975206611570248,\n \"acc_stderr\": 0.041733491480834994,\n \"\
acc_norm\": 0.2975206611570248,\n \"acc_norm_stderr\": 0.041733491480834994\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.15337423312883436,\n \"acc_stderr\": 0.02831160144143859,\n\
\ \"acc_norm\": 0.15337423312883436,\n \"acc_norm_stderr\": 0.02831160144143859\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.21359223300970873,\n \"acc_stderr\": 0.04058042015646036,\n\
\ \"acc_norm\": 0.21359223300970873,\n \"acc_norm_stderr\": 0.04058042015646036\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23504273504273504,\n\
\ \"acc_stderr\": 0.027778835904935423,\n \"acc_norm\": 0.23504273504273504,\n\
\ \"acc_norm_stderr\": 0.027778835904935423\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2388250319284802,\n\
\ \"acc_stderr\": 0.015246803197398687,\n \"acc_norm\": 0.2388250319284802,\n\
\ \"acc_norm_stderr\": 0.015246803197398687\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.01433352205921789,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.01433352205921789\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.02555316999182652,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.02555316999182652\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.22186495176848875,\n\
\ \"acc_stderr\": 0.02359885829286305,\n \"acc_norm\": 0.22186495176848875,\n\
\ \"acc_norm_stderr\": 0.02359885829286305\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24382716049382716,\n \"acc_stderr\": 0.023891879541959617,\n\
\ \"acc_norm\": 0.24382716049382716,\n \"acc_norm_stderr\": 0.023891879541959617\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340460997,\n \
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340460997\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25749674054758803,\n\
\ \"acc_stderr\": 0.011167706014904143,\n \"acc_norm\": 0.25749674054758803,\n\
\ \"acc_norm_stderr\": 0.011167706014904143\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.23161764705882354,\n \"acc_stderr\": 0.025626533803777562,\n\
\ \"acc_norm\": 0.23161764705882354,\n \"acc_norm_stderr\": 0.025626533803777562\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.19607843137254902,\n \"acc_stderr\": 0.01606205642196865,\n \
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.01606205642196865\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.18181818181818182,\n\
\ \"acc_stderr\": 0.036942843353377997,\n \"acc_norm\": 0.18181818181818182,\n\
\ \"acc_norm_stderr\": 0.036942843353377997\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788167,\n\
\ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788167\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2885572139303483,\n\
\ \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.2885572139303483,\n\
\ \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n\
\ \"acc_stderr\": 0.034106466140718564,\n \"acc_norm\": 0.25903614457831325,\n\
\ \"acc_norm_stderr\": 0.034106466140718564\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23391812865497075,\n \"acc_stderr\": 0.03246721765117827,\n\
\ \"acc_norm\": 0.23391812865497075,\n \"acc_norm_stderr\": 0.03246721765117827\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.4935990954197777,\n\
\ \"mc2_stderr\": 0.017220011527240037\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5280189423835833,\n \"acc_stderr\": 0.014030404213405786\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/walebadr/Mistral-7B-v0.1-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|arc:challenge|25_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|arc:challenge|25_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|gsm8k|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|gsm8k|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hellaswag|10_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hellaswag|10_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T20-13-45.405599.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T18-02-23.868441.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T18-02-23.868441.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- '**/details_harness|winogrande|5_2024-01-10T20-13-45.405599.parquet'
- split: 2024_01_13T18_02_23.868441
path:
- '**/details_harness|winogrande|5_2024-01-13T18-02-23.868441.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T18-02-23.868441.parquet'
- config_name: results
data_files:
- split: 2024_01_10T20_13_45.405599
path:
- results_2024-01-10T20-13-45.405599.parquet
- split: 2024_01_13T18_02_23.868441
path:
- results_2024-01-13T18-02-23.868441.parquet
- split: latest
path:
- results_2024-01-13T18-02-23.868441.parquet
---
# Dataset Card for Evaluation run of walebadr/Mistral-7B-v0.1-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [walebadr/Mistral-7B-v0.1-DPO](https://huggingface.co/walebadr/Mistral-7B-v0.1-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_walebadr__Mistral-7B-v0.1-DPO",
"harness_winogrande_5",
split="train")
```
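As a minimal sketch of the split-naming convention described above (an assumption inferred from the config list: a split name appears to be the run timestamp with `-` and `:` replaced by `_`):

```python
# Hypothetical helper, not part of the datasets library: maps a run
# timestamp to the corresponding split name used in this dataset.
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp like '2024-01-13T18:02:23.868441'
    into its split name '2024_01_13T18_02_23.868441'."""
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-01-13T18:02:23.868441"))
# → 2024_01_13T18_02_23.868441
```

You can pass the resulting name as the `split` argument of `load_dataset` to pin a specific run instead of `latest`.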
## Latest results
These are the [latest results from run 2024-01-13T18:02:23.868441](https://huggingface.co/datasets/open-llm-leaderboard/details_walebadr__Mistral-7B-v0.1-DPO/blob/main/results_2024-01-13T18-02-23.868441.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2513839168298002,
"acc_stderr": 0.03077453939218842,
"acc_norm": 0.2517964377923722,
"acc_norm_stderr": 0.03159254911508562,
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.4935990954197777,
"mc2_stderr": 0.017220011527240037
},
"harness|arc:challenge|25": {
"acc": 0.23464163822525597,
"acc_stderr": 0.012383873560768673,
"acc_norm": 0.2781569965870307,
"acc_norm_stderr": 0.0130944699195388
},
"harness|hellaswag|10": {
"acc": 0.2562238597888867,
"acc_stderr": 0.004356547185847042,
"acc_norm": 0.2622983469428401,
"acc_norm_stderr": 0.004389849907040314
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.0359144408419697,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.0359144408419697
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.28289473684210525,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.28289473684210525,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816503,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816503
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22641509433962265,
"acc_stderr": 0.025757559893106765,
"acc_norm": 0.22641509433962265,
"acc_norm_stderr": 0.025757559893106765
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33617021276595743,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.33617021276595743,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022057,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022057
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948368,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948368
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239956,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2019704433497537,
"acc_stderr": 0.028247350122180253,
"acc_norm": 0.2019704433497537,
"acc_norm_stderr": 0.028247350122180253
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.20606060606060606,
"acc_stderr": 0.031584153240477086,
"acc_norm": 0.20606060606060606,
"acc_norm_stderr": 0.031584153240477086
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270286,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2694300518134715,
"acc_stderr": 0.03201867122877794,
"acc_norm": 0.2694300518134715,
"acc_norm_stderr": 0.03201867122877794
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2641025641025641,
"acc_stderr": 0.02235219373745327,
"acc_norm": 0.2641025641025641,
"acc_norm_stderr": 0.02235219373745327
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.02592887613276611,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.02592887613276611
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.030388353551886845,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.030388353551886845
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.0347918557259966,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.0347918557259966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.20550458715596331,
"acc_stderr": 0.017324352325016012,
"acc_norm": 0.20550458715596331,
"acc_norm_stderr": 0.017324352325016012
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.03191923445686185,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.03191923445686185
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.029331162294251742,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.029331162294251742
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2869198312236287,
"acc_stderr": 0.029443773022594693,
"acc_norm": 0.2869198312236287,
"acc_norm_stderr": 0.029443773022594693
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.25112107623318386,
"acc_stderr": 0.02910522083322462,
"acc_norm": 0.25112107623318386,
"acc_norm_stderr": 0.02910522083322462
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2975206611570248,
"acc_stderr": 0.041733491480834994,
"acc_norm": 0.2975206611570248,
"acc_norm_stderr": 0.041733491480834994
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.15337423312883436,
"acc_stderr": 0.02831160144143859,
"acc_norm": 0.15337423312883436,
"acc_norm_stderr": 0.02831160144143859
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.21359223300970873,
"acc_stderr": 0.04058042015646036,
"acc_norm": 0.21359223300970873,
"acc_norm_stderr": 0.04058042015646036
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23504273504273504,
"acc_stderr": 0.027778835904935423,
"acc_norm": 0.23504273504273504,
"acc_norm_stderr": 0.027778835904935423
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2388250319284802,
"acc_stderr": 0.015246803197398687,
"acc_norm": 0.2388250319284802,
"acc_norm_stderr": 0.015246803197398687
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.01433352205921789,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.01433352205921789
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.22186495176848875,
"acc_stderr": 0.02359885829286305,
"acc_norm": 0.22186495176848875,
"acc_norm_stderr": 0.02359885829286305
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24382716049382716,
"acc_stderr": 0.023891879541959617,
"acc_norm": 0.24382716049382716,
"acc_norm_stderr": 0.023891879541959617
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.026684564340460997,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.026684564340460997
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25749674054758803,
"acc_stderr": 0.011167706014904143,
"acc_norm": 0.25749674054758803,
"acc_norm_stderr": 0.011167706014904143
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.23161764705882354,
"acc_stderr": 0.025626533803777562,
"acc_norm": 0.23161764705882354,
"acc_norm_stderr": 0.025626533803777562
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.01606205642196865,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.01606205642196865
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.036942843353377997,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.036942843353377997
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788167,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788167
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2885572139303483,
"acc_stderr": 0.03203841040213321,
"acc_norm": 0.2885572139303483,
"acc_norm_stderr": 0.03203841040213321
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.034106466140718564,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.034106466140718564
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23391812865497075,
"acc_stderr": 0.03246721765117827,
"acc_norm": 0.23391812865497075,
"acc_norm_stderr": 0.03246721765117827
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.4935990954197777,
"mc2_stderr": 0.017220011527240037
},
"harness|winogrande|5": {
"acc": 0.5280189423835833,
"acc_stderr": 0.014030404213405786
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_jisukim8873__mistral-7B-alpaca-case-0-2 | ---
pretty_name: Evaluation run of jisukim8873/mistral-7B-alpaca-case-0-2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jisukim8873/mistral-7B-alpaca-case-0-2](https://huggingface.co/jisukim8873/mistral-7B-alpaca-case-0-2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jisukim8873__mistral-7B-alpaca-case-0-2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-02T15:26:32.124821](https://huggingface.co/datasets/open-llm-leaderboard/details_jisukim8873__mistral-7B-alpaca-case-0-2/blob/main/results_2024-04-02T15-26-32.124821.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5962447605325681,\n\
\ \"acc_stderr\": 0.033197808416410186,\n \"acc_norm\": 0.6039318228216791,\n\
\ \"acc_norm_stderr\": 0.033919783529625006,\n \"mc1\": 0.31456548347613217,\n\
\ \"mc1_stderr\": 0.016255241993179178,\n \"mc2\": 0.4356364417645542,\n\
\ \"mc2_stderr\": 0.015234232689748118\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5802047781569966,\n \"acc_stderr\": 0.01442218122630303,\n\
\ \"acc_norm\": 0.6168941979522184,\n \"acc_norm_stderr\": 0.014206472661672877\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6339374626568413,\n\
\ \"acc_stderr\": 0.004807423343224584,\n \"acc_norm\": 0.8173670583549094,\n\
\ \"acc_norm_stderr\": 0.0038557568514415415\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720685,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720685\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n\
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6774193548387096,\n \"acc_stderr\": 0.026593084516572277,\n \"\
acc_norm\": 0.6774193548387096,\n \"acc_norm_stderr\": 0.026593084516572277\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5948717948717949,\n \"acc_stderr\": 0.024890471769938145,\n\
\ \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.024890471769938145\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394849,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394849\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7761467889908257,\n \"acc_stderr\": 0.017871217767790215,\n \"\
acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.017871217767790215\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604257,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604257\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6962025316455697,\n \"acc_stderr\": 0.0299366963871386,\n \
\ \"acc_norm\": 0.6962025316455697,\n \"acc_norm_stderr\": 0.0299366963871386\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7918263090676884,\n\
\ \"acc_stderr\": 0.014518592248904033,\n \"acc_norm\": 0.7918263090676884,\n\
\ \"acc_norm_stderr\": 0.014518592248904033\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977247,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977247\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3329608938547486,\n\
\ \"acc_stderr\": 0.015761716178397556,\n \"acc_norm\": 0.3329608938547486,\n\
\ \"acc_norm_stderr\": 0.015761716178397556\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.025842248700902168,\n\
\ \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.025842248700902168\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.029494827600144373,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.029494827600144373\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41264667535853977,\n\
\ \"acc_stderr\": 0.012573836633799013,\n \"acc_norm\": 0.41264667535853977,\n\
\ \"acc_norm_stderr\": 0.012573836633799013\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n\
\ \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5964052287581699,\n \"acc_stderr\": 0.019848280168401154,\n \
\ \"acc_norm\": 0.5964052287581699,\n \"acc_norm_stderr\": 0.019848280168401154\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036622,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036622\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368032,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368032\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31456548347613217,\n\
\ \"mc1_stderr\": 0.016255241993179178,\n \"mc2\": 0.4356364417645542,\n\
\ \"mc2_stderr\": 0.015234232689748118\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836682\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18953752843062927,\n \
\ \"acc_stderr\": 0.010795837931896387\n }\n}\n```"
repo_url: https://huggingface.co/jisukim8873/mistral-7B-alpaca-case-0-2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|arc:challenge|25_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|gsm8k|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hellaswag|10_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T15-26-32.124821.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T15-26-32.124821.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- '**/details_harness|winogrande|5_2024-04-02T15-26-32.124821.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T15-26-32.124821.parquet'
- config_name: results
data_files:
- split: 2024_04_02T15_26_32.124821
path:
- results_2024-04-02T15-26-32.124821.parquet
- split: latest
path:
- results_2024-04-02T15-26-32.124821.parquet
---
# Dataset Card for Evaluation run of jisukim8873/mistral-7B-alpaca-case-0-2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jisukim8873/mistral-7B-alpaca-case-0-2](https://huggingface.co/jisukim8873/mistral-7B-alpaca-case-0-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jisukim8873__mistral-7B-alpaca-case-0-2",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-04-02T15:26:32.124821](https://huggingface.co/datasets/open-llm-leaderboard/details_jisukim8873__mistral-7B-alpaca-case-0-2/blob/main/results_2024-04-02T15-26-32.124821.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its timestamped split and in the "latest" split of the corresponding configuration):
```json
{
"all": {
"acc": 0.5962447605325681,
"acc_stderr": 0.033197808416410186,
"acc_norm": 0.6039318228216791,
"acc_norm_stderr": 0.033919783529625006,
"mc1": 0.31456548347613217,
"mc1_stderr": 0.016255241993179178,
"mc2": 0.4356364417645542,
"mc2_stderr": 0.015234232689748118
},
"harness|arc:challenge|25": {
"acc": 0.5802047781569966,
"acc_stderr": 0.01442218122630303,
"acc_norm": 0.6168941979522184,
"acc_norm_stderr": 0.014206472661672877
},
"harness|hellaswag|10": {
"acc": 0.6339374626568413,
"acc_stderr": 0.004807423343224584,
"acc_norm": 0.8173670583549094,
"acc_norm_stderr": 0.0038557568514415415
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.042849586397534015,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.042849586397534015
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.026593084516572277,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.026593084516572277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5948717948717949,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.5948717948717949,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394849,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394849
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.017871217767790215,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.017871217767790215
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604257,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6962025316455697,
"acc_stderr": 0.0299366963871386,
"acc_norm": 0.6962025316455697,
"acc_norm_stderr": 0.0299366963871386
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7918263090676884,
"acc_stderr": 0.014518592248904033,
"acc_norm": 0.7918263090676884,
"acc_norm_stderr": 0.014518592248904033
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977247,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977247
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3329608938547486,
"acc_stderr": 0.015761716178397556,
"acc_norm": 0.3329608938547486,
"acc_norm_stderr": 0.015761716178397556
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.025842248700902168,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.025842248700902168
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.029494827600144373,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.029494827600144373
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41264667535853977,
"acc_stderr": 0.012573836633799013,
"acc_norm": 0.41264667535853977,
"acc_norm_stderr": 0.012573836633799013
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5964052287581699,
"acc_stderr": 0.019848280168401154,
"acc_norm": 0.5964052287581699,
"acc_norm_stderr": 0.019848280168401154
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726492,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368032,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368032
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31456548347613217,
"mc1_stderr": 0.016255241993179178,
"mc2": 0.4356364417645542,
"mc2_stderr": 0.015234232689748118
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836682
},
"harness|gsm8k|5": {
"acc": 0.18953752843062927,
"acc_stderr": 0.010795837931896387
}
}
```
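The per-subtask `acc` values above can be averaged into a single MMLU-style score. A minimal sketch using a handful of values copied from the results above (a full aggregation would iterate over all 57 `hendrycksTest` subtasks):

```python
# Average per-subtask accuracies into one MMLU score.
# Values copied from a few of the harness results above; a full
# aggregation would cover every hendrycksTest subtask.
results = {
    "nutrition": 0.6928104575163399,
    "philosophy": 0.6752411575562701,
    "prehistory": 0.6851851851851852,
    "professional_accounting": 0.425531914893617,
    "professional_law": 0.41264667535853977,
}

mmlu_avg = sum(results.values()) / len(results)
print(f"average acc over {len(results)} subtasks: {mmlu_avg:.4f}")
```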
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lmms-lab/POPE | ---
dataset_info:
features:
- name: id
dtype: string
- name: question_id
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: image_source
dtype: string
- name: image
dtype: image
- name: category
dtype: string
splits:
- name: test
num_bytes: 1471200135.0
num_examples: 9000
download_size: 255022914
dataset_size: 1471200135.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
<p align="center" width="100%">
<img src="https://i.postimg.cc/g0QRgMVv/WX20240228-113337-2x.png" width="100%" height="80%">
</p>
# Large-scale Multi-modality Models Evaluation Suite
> Accelerating the development of large-scale multi-modality models (LMMs) with `lmms-eval`
🏠 [Homepage](https://lmms-lab.github.io/) | 📚 [Documentation](docs/README.md) | 🤗 [Huggingface Datasets](https://huggingface.co/lmms-lab)
# This Dataset
This is a formatted version of [POPE](https://github.com/RUCAIBox/POPE). It is used in our `lmms-eval` pipeline to allow for one-click evaluations of large multi-modality models.
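Because the `answer` field is a binary "yes"/"no" label (object present or hallucinated), scoring reduces to standard classification metrics. A minimal sketch of that scoring step, assuming model predictions have already been parsed into "yes"/"no" strings (the exact prompting and parsing inside `lmms-eval` may differ):

```python
def pope_metrics(answers, predictions):
    """Compute accuracy, precision, recall, F1, and yes-ratio for
    yes/no object-hallucination questions ("yes" = object present)."""
    tp = sum(a == "yes" and p == "yes" for a, p in zip(answers, predictions))
    fp = sum(a == "no" and p == "yes" for a, p in zip(answers, predictions))
    fn = sum(a == "yes" and p == "no" for a, p in zip(answers, predictions))
    tn = sum(a == "no" and p == "no" for a, p in zip(answers, predictions))

    accuracy = (tp + tn) / len(answers)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    yes_ratio = (tp + fp) / len(answers)  # fraction of "yes" predictions
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1, "yes_ratio": yes_ratio}

# Toy example with four questions.
m = pope_metrics(["yes", "no", "yes", "no"], ["yes", "yes", "no", "no"])
```

The yes-ratio is reported alongside F1 in the POPE paper because a model that answers "yes" indiscriminately can still score a nontrivial accuracy.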
```
@article{li2023evaluating,
title={Evaluating object hallucination in large vision-language models},
author={Li, Yifan and Du, Yifan and Zhou, Kun and Wang, Jinpeng and Zhao, Wayne Xin and Wen, Ji-Rong},
journal={arXiv preprint arXiv:2305.10355},
year={2023}
}
```
|
Sampson2022/demo3 | ---
type: demo3
---
# Dataset Card for Demo3
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This is a demo dataset. It consists of two files, `data/train.csv` and `data/test.csv`.
You can load it with:
```python
from datasets import load_dataset
demo3 = load_dataset("Sampson2022/demo3")
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
ramgus/fullalbumcovers | ---
dataset_info:
features:
- name: image
dtype: image
- name: danceability
dtype: float64
- name: energy
dtype: float64
- name: key
dtype: int64
- name: loudness
dtype: float64
- name: mode
dtype: int64
- name: speechiness
dtype: float64
- name: acousticness
dtype: float64
- name: instrumentalness
dtype: float64
- name: liveness
dtype: float64
- name: valence
dtype: float64
- name: tempo
dtype: float64
- name: type
dtype: string
- name: uri
dtype: string
- name: track_href
dtype: string
- name: analysis_url
dtype: string
- name: duration_ms
dtype: int64
- name: time_signature
dtype: int64
splits:
- name: train
num_bytes: 118456831.258
num_examples: 1181
download_size: 92490662
dataset_size: 118456831.258
---
# Dataset Card for "fullalbumcovers"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
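The numeric audio features above sit on very different scales (`loudness` in dB, `tempo` in BPM, `duration_ms` in milliseconds, while most others are already in [0, 1]), so a common preprocessing step before using them as conditioning signals is min-max normalization. A minimal sketch over hypothetical rows (only the field names come from the schema above):

```python
def min_max_normalize(rows, keys):
    """Scale each listed feature to [0, 1] across all rows."""
    mins = {k: min(r[k] for r in rows) for k in keys}
    maxs = {k: max(r[k] for r in rows) for k in keys}
    out = []
    for r in rows:
        scaled = dict(r)
        for k in keys:
            span = maxs[k] - mins[k]
            scaled[k] = (r[k] - mins[k]) / span if span else 0.0
        out.append(scaled)
    return out

# Hypothetical rows; real values come from the dataset's train split.
tracks = [
    {"loudness": -12.0, "tempo": 90.0, "danceability": 0.4},
    {"loudness": -6.0, "tempo": 120.0, "danceability": 0.7},
    {"loudness": -3.0, "tempo": 150.0, "danceability": 0.9},
]
norm = min_max_normalize(tracks, ["loudness", "tempo", "danceability"])
```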
ere3545/mikeyy1 | ---
license: bigscience-openrail-m
---
|
EleutherAI/quirky_squaring_increment0_alice_hard | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 1595505.5
num_examples: 23000
- name: validation
num_bytes: 72173.92
num_examples: 1040
- name: test
num_bytes: 72935.45875
num_examples: 1051
download_size: 650868
dataset_size: 1740614.87875
---
# Dataset Card for "quirky_squaring_increment0_alice_hard"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jarrydmartinx/metabric | ---
dataset_info:
features:
- name: x0
dtype: float32
- name: x1
dtype: float32
- name: x2
dtype: float32
- name: x3
dtype: float32
- name: x4
dtype: float32
- name: x5
dtype: float32
- name: x6
dtype: float32
- name: x7
dtype: float32
- name: x8
dtype: float32
- name: event_time
dtype: float32
- name: event_indicator
dtype: int32
splits:
- name: train
num_bytes: 83776
num_examples: 1904
download_size: 68030
dataset_size: 83776
---
# Dataset Card for "metabric"
The METABRIC survival dataset, as distributed by the `pycox` package.
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
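The schema above is typical for survival analysis: `event_time` is the follow-up duration and `event_indicator` marks whether the event was observed (1) or the record was right-censored (0). A minimal Kaplan-Meier sketch over hypothetical `(time, event)` pairs, not the real METABRIC rows:

```python
def kaplan_meier(times, events):
    """Return (time, survival probability) pairs of the Kaplan-Meier
    estimator.  times = follow-up durations (event_time);
    events = 1 if observed, 0 if censored (event_indicator)."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = removed = 0
        # Group all records sharing this event time.
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            removed += 1
            i += 1
        if deaths:  # censored-only times do not change the curve
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= removed
    return curve

# Toy cohort: one censored record at t=2.0.
curve = kaplan_meier([1.0, 2.0, 2.0, 3.0], [1, 1, 0, 1])
```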
hosiet/test-1 | ---
license: apache-2.0
---
|
Tanvir1337/greetings | ---
license: cdla-sharing-1.0
pretty_name: Greetings
tags:
- GPT-3.5
- GPT-4
- Claude
- Bard
- Alpaca
- LLaMA
- LLaMA-2
- Vicuna
- PaLM-2
- Multilingual
multilinguality:
- multilingual
size_categories:
- 1K<n<10K
---
# Greetings [TXT dataset]
A dataset comprising artificially generated **greetings** derived from a diverse array of Large Language Models (LLMs) such as GPT-3.5, GPT-4, Claude, Bard, Alpaca, LLaMA, LLaMA-2, Vicuna, and PaLM-2. These greetings cover various types and are expressed in multiple languages.
## Prompt
The prompt used:
```txt
Please generate a diverse range of English greetings, and I'll guide you to continue if I require more. You can also incorporate greetings from different languages and cultures for added diversity. No need for explanations or additional information.
```
## TODO
- Categorize them into types (Formal, Informal/Casual, Professional, Family, Friendship, Multilingual, ...) and Cultural Origin (General, Indian, British, Australian, ...)
## Disclaimer
Please note that while I strive to maintain data quality, I cannot guarantee the accuracy or quality of all entries in this dataset. Use it responsibly and exercise caution when relying on the data for any critical applications. Your feedback and contributions are greatly appreciated for improving the dataset's overall quality.
|
DIAS123/Valentino | ---
license: openrail
---
|
CyberHarem/shirayuki_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shirayuki/シラユキ/白雪 (Arknights)
This is the dataset of shirayuki/シラユキ/白雪 (Arknights), containing 90 images and their tags.
The core tags of this character are `animal_ears, short_hair, weasel_ears, white_hair, blue_eyes, hair_over_one_eye, weasel_girl, ear_piercing, weasel_tail, tail`, which are pruned in this dataset.
Images were crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 90 | 129.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirayuki_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 90 | 108.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirayuki_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 221 | 203.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shirayuki_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shirayuki_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, black_gloves, black_shirt, fishnets, ninja_mask, piercing, solo, belt, black_pants, fingerless_gloves, looking_at_viewer, sleeveless, slit_pupils, hood, holding, nail_polish, pouch, shuriken |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_gloves | black_shirt | fishnets | ninja_mask | piercing | solo | belt | black_pants | fingerless_gloves | looking_at_viewer | sleeveless | slit_pupils | hood | holding | nail_polish | pouch | shuriken |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------------|:-----------|:-------------|:-----------|:-------|:-------|:--------------|:--------------------|:--------------------|:-------------|:--------------|:-------|:----------|:--------------|:--------|:-----------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|