karmiq/wikipedia-embeddings-cs-e5-base | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: chunks
sequence: string
- name: embeddings
sequence:
sequence: float32
splits:
- name: train
num_bytes: 5021489124
num_examples: 534044
download_size: 4750515911
dataset_size: 5021489124
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- cs
size_categories:
- 100K<n<1M
task_categories:
- text-generation
- fill-mask
license:
- cc-by-sa-3.0
- gfdl
---
This dataset contains the Czech subset of the [`wikimedia/wikipedia`](https://huggingface.co/datasets/wikimedia/wikipedia) dataset. Each page is divided into paragraphs, stored as a list in the `chunks` column. For every paragraph, embeddings are created using the [`intfloat/multilingual-e5-base`](https://huggingface.co/intfloat/multilingual-e5-base) model.
## Usage
Load the dataset:
```python
from datasets import load_dataset
ds = load_dataset("karmiq/wikipedia-embeddings-cs-e5-base", split="train")
ds[1]
```
```
{
'id': '1',
'url': 'https://cs.wikipedia.org/wiki/Astronomie',
'title': 'Astronomie',
'chunks': [
'Astronomie, řecky αστρονομία z άστρον ( astron ) hvězda a νόμος ( nomos )...',
'Myšlenky Aristotelovy rozvinul ve 2. století našeho letopočtu Klaudios Ptolemaios...',
...,
],
'embeddings': [
[0.09006806463003159, -0.009814552962779999, ...],
[0.10767366737127304, ...],
...
]
}
```
The structure makes it easy to use the dataset for implementing semantic search.
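As a minimal, self-contained sketch of that idea (made-up three-dimensional vectors stand in for the real 768-dimensional e5 embeddings), ranking one record's chunks by cosine similarity to a query embedding looks like this:

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Toy stand-ins for one record's `chunks` and `embeddings` columns.
chunks = ["Astronomie, řecky αστρονομία…", "Myšlenky Aristotelovy rozvinul…"]
embeddings = [[0.9, 0.1, 0.0], [0.1, 0.8, 0.2]]

query_embedding = [0.85, 0.15, 0.05]  # would come from the same e5 model

ranked = sorted(
    zip(chunks, embeddings),
    key=lambda pair: cosine(query_embedding, pair[1]),
    reverse=True,
)
print(ranked[0][0])  # the chunk closest to the query ranks first
```

In practice the query embedding would come from the same `intfloat/multilingual-e5-base` model that produced the `embeddings` column, as in the examples below.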
<details>
<summary>Load the data in Elasticsearch</summary>
```python
from elasticsearch import Elasticsearch
from elasticsearch.helpers import parallel_bulk
from tqdm import tqdm

es = Elasticsearch("http://localhost:9200")  # adjust for your cluster

def doc_generator(data, batch_size=1000):
    for batch in data.with_format("numpy").iter(batch_size):
        for i, id in enumerate(batch["id"]):
            output = {"id": id}
            output["title"] = batch["title"][i]
            output["url"] = batch["url"][i]
            output["parts"] = [
                {"chunk": chunk, "embedding": embedding}
                for chunk, embedding in zip(batch["chunks"][i], batch["embeddings"][i])
            ]
            yield output

progress = tqdm(total=ds.num_rows, unit="doc", desc="Indexing")

for ok, info in parallel_bulk(
    es,
    index="wikipedia-search",
    actions=doc_generator(ds),
    raise_on_error=False,
):
    if not ok:
        print(
            f"ERROR {info['index']['status']}: "
            f"{info['index']['error']['type']}: {info['index']['error']['caused_by']['type']}: "
            f"{info['index']['error']['caused_by']['reason'][:250]}"
        )
    progress.update(1)
```
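The snippet assumes an existing `es` client and an index whose mapping stores each chunk together with its vector. A sketch of such a mapping (the field names mirror the documents produced by `doc_generator`, and `dims` matches the 768-dimensional e5-base vectors; the exact settings are an assumption, not part of this card):

```python
# Sketch of an index mapping compatible with the documents above.
mapping = {
    "mappings": {
        "properties": {
            "title": {"type": "text"},
            "url": {"type": "keyword"},
            "parts": {
                "type": "nested",  # keep each chunk/embedding pair together
                "properties": {
                    "chunk": {"type": "text"},
                    "embedding": {
                        "type": "dense_vector",
                        "dims": 768,  # multilingual-e5-base output size
                        "index": True,
                        "similarity": "cosine",
                    },
                },
            },
        }
    }
}
# es.indices.create(index="wikipedia-search", mappings=mapping["mappings"])
```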
</details>
<details>
<summary>Use <code>sentence_transformers.util.semantic_search</code></summary>
```python
import os
import textwrap

import sentence_transformers

model = sentence_transformers.SentenceTransformer("intfloat/multilingual-e5-base")

ds.set_format(type="torch", columns=["embeddings"], output_all_columns=True)

# Flatten the dataset so each row is a single (chunk, embedding) pair
def explode_sequence(batch):
    output = {"id": [], "url": [], "title": [], "chunk": [], "embedding": []}
    for id, url, title, chunks, embeddings in zip(
        batch["id"], batch["url"], batch["title"], batch["chunks"], batch["embeddings"]
    ):
        output["id"].extend([id for _ in range(len(chunks))])
        output["url"].extend([url for _ in range(len(chunks))])
        output["title"].extend([title for _ in range(len(chunks))])
        output["chunk"].extend(chunks)
        output["embedding"].extend(embeddings)
    return output

ds_flat = ds.map(
    explode_sequence,
    batched=True,
    remove_columns=ds.column_names,
    num_proc=min(os.cpu_count(), 32),
    desc="Flatten")
ds_flat

query = "Čím se zabývá fyzika?"

hits = sentence_transformers.util.semantic_search(
    query_embeddings=model.encode(query),
    corpus_embeddings=ds_flat["embedding"],
    top_k=10)

for hit in hits[0]:
    title = ds_flat[hit['corpus_id']]['title']
    chunk = ds_flat[hit['corpus_id']]['chunk']
    print(f"[{hit['score']:0.2f}] {textwrap.shorten(chunk, width=100, placeholder='…')} [{title}]")

# [0.90] Fyzika částic ( též částicová fyzika ) je oblast fyziky, která se zabývá částicemi. V širším smyslu… [Fyzika částic]
# [0.89] Fyzika ( z řeckého φυσικός ( fysikos ): přírodní, ze základu φύσις ( fysis ): příroda, archaicky… [Fyzika]
# ...
```
</details>
Generating the embeddings took about two hours on an NVIDIA A100 80GB GPU.
## License
See license of the original dataset: <https://huggingface.co/datasets/wikimedia/wikipedia>.
|
aisc-team-a1/Asclepius-Synthetic-Clinical-Notes | ---
dataset_info:
features:
- name: patient_id
dtype: int64
- name: note
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: task
dtype: string
splits:
- name: train
num_bytes: 403104396
num_examples: 158114
download_size: 198605402
dataset_size: 403104396
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-nc-sa-4.0
task_categories:
- text-generation
language:
- en
tags:
- medical
- synthetic
pretty_name: 'Asclepius: Synthetic Clinical Notes & Instruction Dataset'
size_categories:
- 100K<n<1M
---
*This is a dataset repository made for the AISC class at Harvard Medical School. Please find the original dataset repository here: https://huggingface.co/datasets/starmpcc/Asclepius-Synthetic-Clinical-Notes*
# Asclepius: Synthetic Clinical Notes & Instruction Dataset
## Dataset Description
- **Repository:** [Github](https://github.com/starmpcc/Asclepius)
- **Paper:** https://arxiv.org/abs/2309.00237
### Dataset Summary
This is the official dataset for Asclepius [(arXiv)](https://arxiv.org/abs/2309.00237).
The dataset follows a Clinical Note - Question - Answer format and is intended for building clinical LLMs.
- We first synthesized clinical notes from [PMC-Patients](https://huggingface.co/datasets/zhengyun21/PMC-Patients) case reports with GPT-3.5
- Then, we generated instruction-answer pairs for the 157k synthetic discharge summaries
### Supported Tasks
- This dataset covers the following 8 tasks
- Named Entity Recognition
- Abbreviation Expansion
- Relation Extraction
- Temporal Information Extraction
- Coreference Resolution
- Paraphrasing
- Summarization
- Question Answering
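Since every row carries its category in the `task` column, the examples can be partitioned by task. A minimal sketch with made-up rows that mimic the schema (all values below are hypothetical):

```python
from collections import Counter

# Hypothetical rows following the dataset schema:
# (patient_id, note, question, answer, task)
rows = [
    {"patient_id": 1, "note": "…", "question": "Summarize the note.",
     "answer": "…", "task": "Summarization"},
    {"patient_id": 2, "note": "…", "question": "Expand the abbreviation 'CHF'.",
     "answer": "…", "task": "Abbreviation Expansion"},
    {"patient_id": 3, "note": "…", "question": "Summarize the note.",
     "answer": "…", "task": "Summarization"},
]

# Count how many examples fall into each task category.
per_task = Counter(row["task"] for row in rows)
print(per_task["Summarization"])  # -> 2
```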
### Languages
English
## Dataset Structure
### Data Instances
- `synthetic.csv`
- Clinical Note - Question - Answer pairs
### Data Fields
- `patient_id`: Unique case report ID from PMC-Patients
- `note`: Synthetic case report text
- `question`: GPT-3.5-generated instruction for the note. The prompts used can be found on GitHub.
- `answer`: GPT-3.5-generated answer for the given case report and question
- `task`: Category of the question, one of the eight tasks listed above
## Dataset Creation
### Source Data
[PMC-Patients](https://huggingface.co/datasets/zhengyun21/PMC-Patients)
### Annotations
We used GPT-3.5-turbo (version 0314).
You can check the prompts on our GitHub.
## Additional Information
### Models
- [Asclepius-7B](https://huggingface.co/starmpcc/Asclepius-7B)
- [Asclepius-13B](https://huggingface.co/starmpcc/Asclepius-13B)
- [Asclepius-Llama2-7B](https://huggingface.co/starmpcc/Asclepius-Llama2-7B)
- [Asclepius-Llama2-13B](https://huggingface.co/starmpcc/Asclepius-Llama2-13B)
### Variants
- The instruction-answer pairs generated from MIMIC-III discharge summaries and the models trained with them are now available on [Physionet](https://physionet.org/content/asclepius-r/1.0.0/)!
### Licensing Information
CC-BY-NC-SA 4.0
### Citation Information
```
@misc{kweon2023publicly,
title={Publicly Shareable Clinical Large Language Model Built on Synthetic Clinical Notes},
author={Sunjun Kweon and Junu Kim and Jiyoun Kim and Sujeong Im and Eunbyeol Cho and Seongsu Bae and Jungwoo Oh and Gyubok Lee and Jong Hak Moon and Seng Chan You and Seungjin Baek and Chang Hoon Han and Yoon Bin Jung and Yohan Jo and Edward Choi},
year={2023},
eprint={2309.00237},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
liuyanchen1015/MULTI_VALUE_rte_drop_inf_to | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 474918
num_examples: 1144
- name: train
num_bytes: 442479
num_examples: 1028
download_size: 601843
dataset_size: 917397
---
# Dataset Card for "MULTI_VALUE_rte_drop_inf_to"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PNLPhub/PEYMA | ---
license: apache-2.0
dataset_info:
config_name: PEYMA
features:
- name: tokens
sequence: string
- name: tags
sequence:
class_label:
names:
'0': O
'1': B_DAT
'2': B_LOC
'3': B_MON
'4': B_ORG
'5': B_PCT
'6': B_PER
'7': B_TIM
'8': I_DAT
'9': I_LOC
'10': I_MON
'11': I_ORG
'12': I_PCT
'13': I_PER
'14': I_TIM
splits:
- name: train
num_bytes: 4885030
num_examples: 8028
- name: test
num_bytes: 648919
num_examples: 1026
- name: validation
num_bytes: 535910
num_examples: 925
download_size: 0
dataset_size: 6069859
---
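The `tags` feature uses a B/I/O scheme: `B_*` opens an entity of the given type (DAT, LOC, MON, ORG, PCT, PER, TIM) and `I_*` continues it. A minimal sketch of decoding such a tag sequence into entity spans (the tokens below are hypothetical, for illustration only):

```python
def decode_entities(tokens, tags):
    """Group a B_*/I_*/O tag sequence into (entity_type, text) spans."""
    entities, current_type, current_tokens = [], None, []
    for token, tag in zip(tokens, tags):
        if tag.startswith("B_"):
            if current_tokens:
                entities.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = tag[2:], [token]
        elif tag.startswith("I_") and current_type == tag[2:]:
            current_tokens.append(token)
        else:  # "O" or an inconsistent I_* tag closes the current span
            if current_tokens:
                entities.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = None, []
    if current_tokens:
        entities.append((current_type, " ".join(current_tokens)))
    return entities

# Hypothetical example: PER = person, LOC = location.
print(decode_entities(
    ["Ali", "Daei", "visited", "Tehran"],
    ["B_PER", "I_PER", "O", "B_LOC"],
))  # -> [('PER', 'Ali Daei'), ('LOC', 'Tehran')]
```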
|
jurnu/df | ---
license: creativeml-openrail-m
language:
- es
--- |
krishi/interior_design_krishi | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 11957110.0
num_examples: 10
download_size: 11959191
dataset_size: 11957110.0
---
# Dataset Card for "interior_design_krishi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-cnn_dailymail-b5ccd808-10945470 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-base-16384-book-summary
metrics: ['bleu']
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-base-16384-book-summary
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
open-llm-leaderboard/details_vikash06__doctorLLM5k | ---
pretty_name: Evaluation run of vikash06/doctorLLM5k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [vikash06/doctorLLM5k](https://huggingface.co/vikash06/doctorLLM5k) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vikash06__doctorLLM5k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-03T18:47:28.390342](https://huggingface.co/datasets/open-llm-leaderboard/details_vikash06__doctorLLM5k/blob/main/results_2024-02-03T18-47-28.390342.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.44962394901525865,\n\
\ \"acc_stderr\": 0.034433497653991056,\n \"acc_norm\": 0.45409647084443916,\n\
\ \"acc_norm_stderr\": 0.035204630647983674,\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.015702107090627908,\n \"mc2\": 0.4313786428373932,\n\
\ \"mc2_stderr\": 0.015714557783652643\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5017064846416383,\n \"acc_stderr\": 0.014611305705056983,\n\
\ \"acc_norm\": 0.5247440273037542,\n \"acc_norm_stderr\": 0.014593487694937742\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6186018721370244,\n\
\ \"acc_stderr\": 0.0048473726701346405,\n \"acc_norm\": 0.7965544712208723,\n\
\ \"acc_norm_stderr\": 0.004017383866405767\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4144736842105263,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.4144736842105263,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.44150943396226416,\n \"acc_stderr\": 0.030561590426731837,\n\
\ \"acc_norm\": 0.44150943396226416,\n \"acc_norm_stderr\": 0.030561590426731837\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4791666666666667,\n\
\ \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.4791666666666667,\n\
\ \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.4393063583815029,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715563,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715563\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.04122737111370331,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.04122737111370331\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2698412698412698,\n \"acc_stderr\": 0.022860838309232072,\n \"\
acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.022860838309232072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4967741935483871,\n\
\ \"acc_stderr\": 0.02844341422643833,\n \"acc_norm\": 0.4967741935483871,\n\
\ \"acc_norm_stderr\": 0.02844341422643833\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3399014778325123,\n \"acc_stderr\": 0.0333276906841079,\n\
\ \"acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.0333276906841079\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.038835659779569286,\n\
\ \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.038835659779569286\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4898989898989899,\n \"acc_stderr\": 0.035616254886737454,\n \"\
acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.035616254886737454\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.03447478286414357,\n\
\ \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.03447478286414357\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4076923076923077,\n \"acc_stderr\": 0.024915243985987847,\n\
\ \"acc_norm\": 0.4076923076923077,\n \"acc_norm_stderr\": 0.024915243985987847\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.026466117538959916,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.026466117538959916\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4327731092436975,\n \"acc_stderr\": 0.03218358107742613,\n \
\ \"acc_norm\": 0.4327731092436975,\n \"acc_norm_stderr\": 0.03218358107742613\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5926605504587156,\n \"acc_stderr\": 0.021065986244412895,\n \"\
acc_norm\": 0.5926605504587156,\n \"acc_norm_stderr\": 0.021065986244412895\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2638888888888889,\n \"acc_stderr\": 0.030058202704309846,\n \"\
acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.030058202704309846\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4852941176470588,\n \"acc_stderr\": 0.03507793834791324,\n \"\
acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03507793834791324\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5864978902953587,\n \"acc_stderr\": 0.03205649904851859,\n \
\ \"acc_norm\": 0.5864978902953587,\n \"acc_norm_stderr\": 0.03205649904851859\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.547085201793722,\n\
\ \"acc_stderr\": 0.033408675019233246,\n \"acc_norm\": 0.547085201793722,\n\
\ \"acc_norm_stderr\": 0.033408675019233246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5038167938931297,\n \"acc_stderr\": 0.043851623256015534,\n\
\ \"acc_norm\": 0.5038167938931297,\n \"acc_norm_stderr\": 0.043851623256015534\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4785276073619632,\n \"acc_stderr\": 0.0392474687675113,\n\
\ \"acc_norm\": 0.4785276073619632,\n \"acc_norm_stderr\": 0.0392474687675113\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.04911147107365777,\n\
\ \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.04911147107365777\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6794871794871795,\n\
\ \"acc_stderr\": 0.030572811310299607,\n \"acc_norm\": 0.6794871794871795,\n\
\ \"acc_norm_stderr\": 0.030572811310299607\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.611749680715198,\n\
\ \"acc_stderr\": 0.017427673295544323,\n \"acc_norm\": 0.611749680715198,\n\
\ \"acc_norm_stderr\": 0.017427673295544323\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4797687861271676,\n \"acc_stderr\": 0.026897049996382868,\n\
\ \"acc_norm\": 0.4797687861271676,\n \"acc_norm_stderr\": 0.026897049996382868\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n\
\ \"acc_stderr\": 0.01448750085285041,\n \"acc_norm\": 0.25027932960893856,\n\
\ \"acc_norm_stderr\": 0.01448750085285041\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.028452639985088006,\n\
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.028452639985088006\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5691318327974276,\n\
\ \"acc_stderr\": 0.028125340983972714,\n \"acc_norm\": 0.5691318327974276,\n\
\ \"acc_norm_stderr\": 0.028125340983972714\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.47530864197530864,\n \"acc_stderr\": 0.02778680093142745,\n\
\ \"acc_norm\": 0.47530864197530864,\n \"acc_norm_stderr\": 0.02778680093142745\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3120567375886525,\n \"acc_stderr\": 0.02764012054516993,\n \
\ \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.02764012054516993\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37027379400260757,\n\
\ \"acc_stderr\": 0.01233293078125673,\n \"acc_norm\": 0.37027379400260757,\n\
\ \"acc_norm_stderr\": 0.01233293078125673\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4166666666666667,\n \"acc_stderr\": 0.019944914136873576,\n \
\ \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.019944914136873576\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.40816326530612246,\n \"acc_stderr\": 0.03146465712827423,\n\
\ \"acc_norm\": 0.40816326530612246,\n \"acc_norm_stderr\": 0.03146465712827423\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6268656716417911,\n\
\ \"acc_stderr\": 0.034198326081760065,\n \"acc_norm\": 0.6268656716417911,\n\
\ \"acc_norm_stderr\": 0.034198326081760065\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.03546976959393162,\n\
\ \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.03546976959393162\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.015702107090627908,\n \"mc2\": 0.4313786428373932,\n\
\ \"mc2_stderr\": 0.015714557783652643\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6953433307024467,\n \"acc_stderr\": 0.012935646499325307\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14101592115238817,\n \
\ \"acc_stderr\": 0.009586695349244103\n }\n}\n```"
repo_url: https://huggingface.co/vikash06/doctorLLM5k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|arc:challenge|25_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|gsm8k|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hellaswag|10_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T18-47-28.390342.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-03T18-47-28.390342.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- '**/details_harness|winogrande|5_2024-02-03T18-47-28.390342.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-03T18-47-28.390342.parquet'
- config_name: results
data_files:
- split: 2024_02_03T18_47_28.390342
path:
- results_2024-02-03T18-47-28.390342.parquet
- split: latest
path:
- results_2024-02-03T18-47-28.390342.parquet
---
# Dataset Card for Evaluation run of vikash06/doctorLLM5k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vikash06/doctorLLM5k](https://huggingface.co/vikash06/doctorLLM5k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vikash06__doctorLLM5k",
	"harness_winogrande_5",
	split="latest")
```
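Historical runs can be loaded by passing the timestamp-derived split name instead of `latest`. The split names in this dataset appear to be the run's ISO timestamp with the `-` in the date and the `:` in the time replaced by `_` (compare the run `2024-02-03T18:47:28.390342` with the split `2024_02_03T18_47_28.390342` in the configs above). A minimal sketch of that mapping, with a hypothetical helper name:

```python
def timestamp_to_split(ts: str) -> str:
    """Derive the split name used in this dataset from a run timestamp.

    Assumption: splits are the ISO timestamp with "-" (date) and ":" (time)
    replaced by "_", as seen in the config listing above.
    """
    date, time = ts.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(timestamp_to_split("2024-02-03T18:47:28.390342"))
# 2024_02_03T18_47_28.390342
```

The resulting string can then be passed as the `split` argument to `load_dataset` to pin a specific run.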
## Latest results
These are the [latest results from run 2024-02-03T18:47:28.390342](https://huggingface.co/datasets/open-llm-leaderboard/details_vikash06__doctorLLM5k/blob/main/results_2024-02-03T18-47-28.390342.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split of the corresponding eval):
```python
{
"all": {
"acc": 0.44962394901525865,
"acc_stderr": 0.034433497653991056,
"acc_norm": 0.45409647084443916,
"acc_norm_stderr": 0.035204630647983674,
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627908,
"mc2": 0.4313786428373932,
"mc2_stderr": 0.015714557783652643
},
"harness|arc:challenge|25": {
"acc": 0.5017064846416383,
"acc_stderr": 0.014611305705056983,
"acc_norm": 0.5247440273037542,
"acc_norm_stderr": 0.014593487694937742
},
"harness|hellaswag|10": {
"acc": 0.6186018721370244,
"acc_stderr": 0.0048473726701346405,
"acc_norm": 0.7965544712208723,
"acc_norm_stderr": 0.004017383866405767
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4144736842105263,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.4144736842105263,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44150943396226416,
"acc_stderr": 0.030561590426731837,
"acc_norm": 0.44150943396226416,
"acc_norm_stderr": 0.030561590426731837
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4791666666666667,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.4791666666666667,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715563,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.04122737111370331,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.04122737111370331
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.022860838309232072,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.022860838309232072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4967741935483871,
"acc_stderr": 0.02844341422643833,
"acc_norm": 0.4967741935483871,
"acc_norm_stderr": 0.02844341422643833
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.0333276906841079,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.0333276906841079
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5515151515151515,
"acc_stderr": 0.038835659779569286,
"acc_norm": 0.5515151515151515,
"acc_norm_stderr": 0.038835659779569286
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.035616254886737454,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.035616254886737454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6476683937823834,
"acc_stderr": 0.03447478286414357,
"acc_norm": 0.6476683937823834,
"acc_norm_stderr": 0.03447478286414357
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4076923076923077,
"acc_stderr": 0.024915243985987847,
"acc_norm": 0.4076923076923077,
"acc_norm_stderr": 0.024915243985987847
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.026466117538959916,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.026466117538959916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4327731092436975,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.4327731092436975,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5926605504587156,
"acc_stderr": 0.021065986244412895,
"acc_norm": 0.5926605504587156,
"acc_norm_stderr": 0.021065986244412895
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.030058202704309846,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.030058202704309846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03507793834791324,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03507793834791324
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5864978902953587,
"acc_stderr": 0.03205649904851859,
"acc_norm": 0.5864978902953587,
"acc_norm_stderr": 0.03205649904851859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.547085201793722,
"acc_stderr": 0.033408675019233246,
"acc_norm": 0.547085201793722,
"acc_norm_stderr": 0.033408675019233246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5038167938931297,
"acc_stderr": 0.043851623256015534,
"acc_norm": 0.5038167938931297,
"acc_norm_stderr": 0.043851623256015534
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5950413223140496,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.5950413223140496,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4785276073619632,
"acc_stderr": 0.0392474687675113,
"acc_norm": 0.4785276073619632,
"acc_norm_stderr": 0.0392474687675113
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.030572811310299607,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.030572811310299607
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.611749680715198,
"acc_stderr": 0.017427673295544323,
"acc_norm": 0.611749680715198,
"acc_norm_stderr": 0.017427673295544323
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.026897049996382868,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.026897049996382868
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25027932960893856,
"acc_stderr": 0.01448750085285041,
"acc_norm": 0.25027932960893856,
"acc_norm_stderr": 0.01448750085285041
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.028452639985088006,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.028452639985088006
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5691318327974276,
"acc_stderr": 0.028125340983972714,
"acc_norm": 0.5691318327974276,
"acc_norm_stderr": 0.028125340983972714
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.47530864197530864,
"acc_stderr": 0.02778680093142745,
"acc_norm": 0.47530864197530864,
"acc_norm_stderr": 0.02778680093142745
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3120567375886525,
"acc_stderr": 0.02764012054516993,
"acc_norm": 0.3120567375886525,
"acc_norm_stderr": 0.02764012054516993
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.37027379400260757,
"acc_stderr": 0.01233293078125673,
"acc_norm": 0.37027379400260757,
"acc_norm_stderr": 0.01233293078125673
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.019944914136873576,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.019944914136873576
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40816326530612246,
"acc_stderr": 0.03146465712827423,
"acc_norm": 0.40816326530612246,
"acc_norm_stderr": 0.03146465712827423
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6268656716417911,
"acc_stderr": 0.034198326081760065,
"acc_norm": 0.6268656716417911,
"acc_norm_stderr": 0.034198326081760065
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.03546976959393162,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.03546976959393162
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627908,
"mc2": 0.4313786428373932,
"mc2_stderr": 0.015714557783652643
},
"harness|winogrande|5": {
"acc": 0.6953433307024467,
"acc_stderr": 0.012935646499325307
},
"harness|gsm8k|5": {
"acc": 0.14101592115238817,
"acc_stderr": 0.009586695349244103
}
}
```
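The `*_stderr` values in the results above are standard errors of the mean accuracy. Under a normal approximation they can be turned into rough 95% confidence intervals (the 1.96 multiplier is the standard normal quantile; this is an illustration, not something the harness reports itself):

```python
def confidence_interval(acc, stderr, z=1.96):
    """Approximate 95% confidence interval for an accuracy estimate,
    given its standard error (normal approximation)."""
    return acc - z * stderr, acc + z * stderr

# gsm8k result from the JSON above:
low, high = confidence_interval(0.14101592115238817, 0.009586695349244103)
print(f"gsm8k acc: 14.1% (95% CI: {low:.1%} - {high:.1%})")
```

This makes it easier to judge whether two models' scores on a subtask are meaningfully different or within noise.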
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed before further recommendations can be made.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Francesco/mask-wearing-608pr | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': mask-wearing
'1': mask
'2': no-mask
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: mask-wearing-608pr
tags:
- rf100
---
# Dataset Card for mask-wearing-608pr
**The original COCO dataset is stored at `dataset.tar.gz`.**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/mask-wearing-608pr
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
mask-wearing-608pr
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
'width': 964043,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the unique id of the image
- `image`: a `PIL.Image.Image` object containing the image. Note that when accessing the image column (`dataset[0]["image"]`), the image file is automatically decoded. Decoding a large number of image files can take a significant amount of time, so it is important to index the sample before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`.
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
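Since `bbox` uses the COCO `[x_min, y_min, width, height]` convention, a small helper (a sketch, not part of the dataset tooling) converts it to the corner format that many visualization and training libraries expect:

```python
def coco_to_corners(bbox):
    """Convert a COCO-format box [x_min, y_min, width, height]
    to corner format [x_min, y_min, x_max, y_max]."""
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

# First box from the data instance above:
print(coco_to_corners([302.0, 109.0, 73.0, 52.0]))  # → [302.0, 109.0, 375.0, 161.0]
```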
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/mask-wearing-608pr
### Citation Information
```
@misc{ mask-wearing-608pr,
title = { mask wearing 608pr Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/mask-wearing-608pr } },
url = { https://universe.roboflow.com/object-detection/mask-wearing-608pr },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
Oasis-Team/Oasis-Corpus | ---
license: odc-by
task_categories:
- text-generation
language:
- zh
- en
size_categories:
- 100B<n<1T
extra_gated_fields:
Name: text
Affiliation: text
Email: text
---
# Dataset Card for Oasis-Corpus
## Dataset Description
Oasis-Corpus is a 783 GB high-quality bilingual corpus.
All data in Oasis-Corpus is built with Oasis and sourced from Common Crawl.
It consists of 374 GB of Chinese text from 17 recent dumps and 409 GB of English text from 5 dumps.
### Languages
English (409 GB, 70,121,125 lines) and Chinese (374 GB, 110,580,964 lines)
## Data Splits
| Language | Dump | docs | size |
| --- | --- | --- | --- |
| Chinese | cc-may-jun-2023-zh | 5,627,020 | 19.31 GB |
| | cc-mar-apr-2023-zh | 5,548,376 | 19.22 GB |
| | cc-jan-feb-2023-zh | 5,369,296 | 18.55 GB |
| | cc-sep-oct-2022-zh | 6,156,501 | 20.86 GB |
| | cc-aug-2022-zh | 4,971,629 | 17.14 GB |
| | cc-jun-jul-2022-zh | 5,566,643 | 18.85 GB |
| | cc-may-2022-zh | 6,408,203 | 21.53 GB |
| | cc-jan-2022-zh | 6,853,895 | 22.70 GB |
| | cc-oct-2021-zh | 7,975,739 | 26.35 GB |
| | cc-sep-2021-zh | 7,371,460 | 24.69 GB |
| | cc-jul-aug-2021-zh | 6,643,794 | 22.17 GB |
| | cc-jun-2021-zh | 6,509,108 | 22.25 GB |
| | cc-may-2021-zh | 5,142,078 | 17.63 GB |
| | cc-apr-2021-zh | 7,284,775 | 24.32 GB |
| | cc-jan-2021-zh | 8,133,760 | 27.19 GB |
| | cc-nov-dec-2020-zh | 6,834,254 | 23.49 GB |
| | cc-oct-2020-zh | 8,184,433 | 27.40 GB |
| English | cc-may-jun-2023-en | 15,712,655 | 90.74 GB |
| | cc-may-2022-en | 14,728,252 | 81.81 GB |
| | cc-jun-jul-2022-en | 14,124,173 | 81.66 GB |
| | cc-jan-2022-en | 12,686,195 | 78.67 GB |
| | cc-oct-2021-en | 12,869,850 | 75.24 GB |
## Dataset Structure
### Data Fields
* text: the processed and cleaned text contained in the page
* timestamp: the timestamp at which the webpage was crawled by Common Crawl
* url: the URL of the webpage crawled to produce the sample
## Dataset Creation
1. Ungoliant content extraction
2. Rule-based filtering
3. Neural filtering
4. Document deduplication
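The deduplication step is not described in detail on this card; purely as an illustration (the actual Oasis pipeline may well use fuzzy or MinHash-based matching instead), exact-match deduplication over normalized text can be sketched as:

```python
import hashlib

def deduplicate(docs):
    """Drop exact duplicate documents by hashing normalized text.
    Keeps the first occurrence of each distinct document."""
    seen, unique = set(), []
    for doc in docs:
        digest = hashlib.sha256(doc.strip().lower().encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(doc)
    return unique

docs = ["Hello world.", "hello world.", "Another page."]
print(deduplicate(docs))  # → ['Hello world.', 'Another page.']
```

Hashing keeps memory proportional to the number of distinct documents rather than their total size, which matters at corpus scale.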
### Contact
The Laboratory of Cognition and Decision Intelligence for Complex Systems. Institute of Automation, Chinese Academy of Sciences
tongzhou21@outlook.com
yubo.chen@nlpr.ia.ac.cn
|
Snoopy04/arc-de-1k | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
struct:
- name: text
sequence: string
- name: label
sequence: string
- name: answerKey
dtype: string
- name: question_de
dtype: string
- name: choices_de
struct:
- name: label
sequence: string
- name: text
sequence: string
- name: translation_de
dtype: string
splits:
- name: test
num_bytes: 998852.3890784982
num_examples: 1000
- name: validation
num_bytes: 296743.34448160534
num_examples: 294
download_size: 709329
dataset_size: 1295595.7335601035
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
straxico/rooms-100 | ---
license: mit
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/83ea63f5 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 188
num_examples: 10
download_size: 1340
dataset_size: 188
---
# Dataset Card for "83ea63f5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-108000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 654760
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
VamsiPranav/telugu_dataset | ---
dataset_info:
features:
- name: sentence_tel_Telu
dtype: string
splits:
- name: gen
num_bytes: 430330
num_examples: 1024
download_size: 188220
dataset_size: 430330
configs:
- config_name: default
data_files:
- split: gen
path: data/gen-*
---
|
bigbio/biored |
---
language:
- en
bigbio_language:
- English
license: unknown
multilinguality: monolingual
bigbio_license_shortname: UNKNOWN
pretty_name: BioRED
homepage: https://ftp.ncbi.nlm.nih.gov/pub/lu/BioRED/
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- NAMED_ENTITY_RECOGNITION
- RELATION_EXTRACTION
---
# Dataset Card for BioRED
## Dataset Description
- **Homepage:** https://ftp.ncbi.nlm.nih.gov/pub/lu/BioRED/
- **Pubmed:** True
- **Public:** True
- **Tasks:** NER,RE
A relation extraction corpus with multiple entity types (e.g., gene/protein,
disease, chemical) and relation pairs (e.g., gene-disease, chemical-chemical),
annotated on a set of 600 PubMed articles.
## Citation Information
```
@article{DBLP:journals/corr/abs-2204-04263,
author = {Ling Luo and
Po{-}Ting Lai and
Chih{-}Hsuan Wei and
Cecilia N. Arighi and
Zhiyong Lu},
title = {BioRED: {A} Comprehensive Biomedical Relation Extraction Dataset},
journal = {CoRR},
volume = {abs/2204.04263},
year = {2022},
url = {https://doi.org/10.48550/arXiv.2204.04263},
doi = {10.48550/arXiv.2204.04263},
eprinttype = {arXiv},
eprint = {2204.04263},
timestamp = {Wed, 11 May 2022 15:24:37 +0200},
biburl = {https://dblp.org/rec/journals/corr/abs-2204-04263.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
|
awettig/RedPajama-combined-15B-6K-llama | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: test
num_bytes: 1422094968
num_examples: 17802
- name: train
num_bytes: 192480977304
num_examples: 2409506
download_size: 577654462
dataset_size: 193903072272
---
# Dataset Card for "RedPajama-combined-15B-6K-llama"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/5415ba1e | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 178
num_examples: 10
download_size: 1330
dataset_size: 178
---
# Dataset Card for "5415ba1e"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TheBloke__CodeLlama-34B-Instruct-fp16 | ---
pretty_name: Evaluation run of TheBloke/CodeLlama-34B-Instruct-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/CodeLlama-34B-Instruct-fp16](https://huggingface.co/TheBloke/CodeLlama-34B-Instruct-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
  \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__CodeLlama-34B-Instruct-fp16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-22T08:36:03.546774](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__CodeLlama-34B-Instruct-fp16/blob/main/results_2023-10-22T08-36-03.546774.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.00039210421902985756,\n \"f1\": 0.057836619127516906,\n\
\ \"f1_stderr\": 0.0012992524934897988,\n \"acc\": 0.4877723610900846,\n\
\ \"acc_stderr\": 0.011924527994986122\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.00039210421902985756,\n\
\ \"f1\": 0.057836619127516906,\n \"f1_stderr\": 0.0012992524934897988\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2304776345716452,\n \
\ \"acc_stderr\": 0.011600249020595822\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.012248806969376422\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/CodeLlama-34B-Instruct-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|arc:challenge|25_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_22T08_36_03.546774
path:
- '**/details_harness|drop|3_2023-10-22T08-36-03.546774.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-22T08-36-03.546774.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_22T08_36_03.546774
path:
- '**/details_harness|gsm8k|5_2023-10-22T08-36-03.546774.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-22T08-36-03.546774.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hellaswag|10_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T01:22:34.444520.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T01:22:34.444520.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T01:22:34.444520.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_22T08_36_03.546774
path:
- '**/details_harness|winogrande|5_2023-10-22T08-36-03.546774.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-22T08-36-03.546774.parquet'
- config_name: results
data_files:
- split: 2023_08_26T01_22_34.444520
path:
- results_2023-08-26T01:22:34.444520.parquet
- split: 2023_10_22T08_36_03.546774
path:
- results_2023-10-22T08-36-03.546774.parquet
- split: latest
path:
- results_2023-10-22T08-36-03.546774.parquet
---
# Dataset Card for Evaluation run of TheBloke/CodeLlama-34B-Instruct-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/CodeLlama-34B-Instruct-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/CodeLlama-34B-Instruct-fp16](https://huggingface.co/TheBloke/CodeLlama-34B-Instruct-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__CodeLlama-34B-Instruct-fp16",
"harness_winogrande_5",
split="train")
```
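The split names visible in the configs above follow directly from the run timestamps: dashes and colons become underscores, while the fractional seconds keep their dot (the on-disk parquet filenames instead replace colons with dashes). A minimal sketch of that mapping, assuming the naming convention shown in this card's configs:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to the split name used in the dataset configs.

    Assumes the convention seen above: "-" and ":" are replaced with "_",
    and the fractional "." is kept as-is.
    """
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-10-22T08:36:03.546774"))
# 2023_10_22T08_36_03.546774
```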
## Latest results
These are the [latest results from run 2023-10-22T08:36:03.546774](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__CodeLlama-34B-Instruct-fp16/blob/main/results_2023-10-22T08-36-03.546774.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902985756,
"f1": 0.057836619127516906,
"f1_stderr": 0.0012992524934897988,
"acc": 0.4877723610900846,
"acc_stderr": 0.011924527994986122
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902985756,
"f1": 0.057836619127516906,
"f1_stderr": 0.0012992524934897988
},
"harness|gsm8k|5": {
"acc": 0.2304776345716452,
"acc_stderr": 0.011600249020595822
},
"harness|winogrande|5": {
"acc": 0.745067087608524,
"acc_stderr": 0.012248806969376422
}
}
```
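Each per-task entry in the JSON above carries its own metric fields, while the `"all"` entry holds the aggregate. A small sketch of pulling the per-task accuracies out of such a results dictionary (the literal below simply repeats numbers shown above):

```python
# Per-task metrics as reported in the latest results above.
results = {
    "all": {"acc": 0.4877723610900846, "acc_stderr": 0.011924527994986122},
    "harness|gsm8k|5": {"acc": 0.2304776345716452, "acc_stderr": 0.011600249020595822},
    "harness|winogrande|5": {"acc": 0.745067087608524, "acc_stderr": 0.012248806969376422},
}

# Keep only the per-task entries that report an accuracy, skipping the aggregate.
task_acc = {
    task: scores["acc"]
    for task, scores in results.items()
    if task != "all" and "acc" in scores
}
print(task_acc)
```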
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
sashankSaya/coco2017 | ---
license: unknown
language:
- en
pretty_name: c0c02o17
size_categories:
- 100B<n<1T
--- |
caiosoares26/vozdocoxinha | ---
license: openrail
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_169 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1236795880.0
num_examples: 242890
download_size: 1261835204
dataset_size: 1236795880.0
---
# Dataset Card for "chunk_169"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dvilasuero/ultrafeedback-followup | ---
dataset_info:
features:
- name: source
dtype: string
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen-rating
dtype: float64
- name: chosen-model
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected-rating
dtype: float64
- name: rejected-model
dtype: string
- name: input
struct:
- name: generation
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: instruction
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: generation_model
dtype: string
- name: generation_prompt
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_generation_responses
sequence: string
- name: generations
sequence: string
splits:
- name: train
num_bytes: 51524779
num_examples: 6000
download_size: 27363328
dataset_size: 51524779
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_ewqr2130__llama2-ppo | ---
pretty_name: Evaluation run of ewqr2130/llama2-ppo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ewqr2130/llama2-ppo](https://huggingface.co/ewqr2130/llama2-ppo) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ewqr2130__llama2-ppo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-05T00:23:47.259679](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__llama2-ppo/blob/main/results_2024-01-05T00-23-47.259679.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3526851673994734,\n\
\ \"acc_stderr\": 0.03310876929515637,\n \"acc_norm\": 0.35709972795834366,\n\
\ \"acc_norm_stderr\": 0.03399609567460545,\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602597,\n \"mc2\": 0.4507763893909204,\n\
\ \"mc2_stderr\": 0.016309761592194282\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.36177474402730375,\n \"acc_stderr\": 0.014041957945038071,\n\
\ \"acc_norm\": 0.41638225255972694,\n \"acc_norm_stderr\": 0.014405618279436181\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3430591515634336,\n\
\ \"acc_stderr\": 0.004737608340163384,\n \"acc_norm\": 0.4946225851424019,\n\
\ \"acc_norm_stderr\": 0.0049894928281685276\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.24342105263157895,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.24342105263157895,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.38,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.38113207547169814,\n \"acc_stderr\": 0.02989060968628664,\n\
\ \"acc_norm\": 0.38113207547169814,\n \"acc_norm_stderr\": 0.02989060968628664\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.041227287076512804,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.041227287076512804\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.32947976878612717,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.03097669299853443,\n\
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.03097669299853443\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776564,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776564\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3870967741935484,\n\
\ \"acc_stderr\": 0.027709359675032488,\n \"acc_norm\": 0.3870967741935484,\n\
\ \"acc_norm_stderr\": 0.027709359675032488\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03255086769970103,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03255086769970103\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.03898531605579418,\n\
\ \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.03898531605579418\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3181818181818182,\n \"acc_stderr\": 0.03318477333845331,\n \"\
acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.03318477333845331\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.49740932642487046,\n \"acc_stderr\": 0.03608390745384488,\n\
\ \"acc_norm\": 0.49740932642487046,\n \"acc_norm_stderr\": 0.03608390745384488\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.29743589743589743,\n \"acc_stderr\": 0.023177408131465942,\n\
\ \"acc_norm\": 0.29743589743589743,\n \"acc_norm_stderr\": 0.023177408131465942\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073828,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073828\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3445378151260504,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.3445378151260504,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473835,\n \"\
acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473835\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.41651376146788993,\n \"acc_stderr\": 0.021136376504030874,\n \"\
acc_norm\": 0.41651376146788993,\n \"acc_norm_stderr\": 0.021136376504030874\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.19907407407407407,\n \"acc_stderr\": 0.027232298462690232,\n \"\
acc_norm\": 0.19907407407407407,\n \"acc_norm_stderr\": 0.027232298462690232\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5392156862745098,\n \"acc_stderr\": 0.03498501649369527,\n \"\
acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.03498501649369527\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5147679324894515,\n \"acc_stderr\": 0.032533028078777386,\n \
\ \"acc_norm\": 0.5147679324894515,\n \"acc_norm_stderr\": 0.032533028078777386\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5291479820627802,\n\
\ \"acc_stderr\": 0.03350073248773403,\n \"acc_norm\": 0.5291479820627802,\n\
\ \"acc_norm_stderr\": 0.03350073248773403\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.32061068702290074,\n \"acc_stderr\": 0.040933292298342784,\n\
\ \"acc_norm\": 0.32061068702290074,\n \"acc_norm_stderr\": 0.040933292298342784\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4049586776859504,\n \"acc_stderr\": 0.044811377559424694,\n \"\
acc_norm\": 0.4049586776859504,\n \"acc_norm_stderr\": 0.044811377559424694\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4351851851851852,\n\
\ \"acc_stderr\": 0.047928981709070624,\n \"acc_norm\": 0.4351851851851852,\n\
\ \"acc_norm_stderr\": 0.047928981709070624\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.34355828220858897,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.34355828220858897,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3300970873786408,\n \"acc_stderr\": 0.046561471100123514,\n\
\ \"acc_norm\": 0.3300970873786408,\n \"acc_norm_stderr\": 0.046561471100123514\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5854700854700855,\n\
\ \"acc_stderr\": 0.03227396567623779,\n \"acc_norm\": 0.5854700854700855,\n\
\ \"acc_norm_stderr\": 0.03227396567623779\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4929757343550447,\n\
\ \"acc_stderr\": 0.017878199003432217,\n \"acc_norm\": 0.4929757343550447,\n\
\ \"acc_norm_stderr\": 0.017878199003432217\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.35260115606936415,\n \"acc_stderr\": 0.02572280220089582,\n\
\ \"acc_norm\": 0.35260115606936415,\n \"acc_norm_stderr\": 0.02572280220089582\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3790849673202614,\n \"acc_stderr\": 0.027780141207023344,\n\
\ \"acc_norm\": 0.3790849673202614,\n \"acc_norm_stderr\": 0.027780141207023344\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.41479099678456594,\n\
\ \"acc_stderr\": 0.02798268045975956,\n \"acc_norm\": 0.41479099678456594,\n\
\ \"acc_norm_stderr\": 0.02798268045975956\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.02712511551316686,\n\
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02712511551316686\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.29432624113475175,\n \"acc_stderr\": 0.027187127011503796,\n \
\ \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.027187127011503796\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3044328552803129,\n\
\ \"acc_stderr\": 0.011752877592597575,\n \"acc_norm\": 0.3044328552803129,\n\
\ \"acc_norm_stderr\": 0.011752877592597575\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.46691176470588236,\n \"acc_stderr\": 0.030306257722468314,\n\
\ \"acc_norm\": 0.46691176470588236,\n \"acc_norm_stderr\": 0.030306257722468314\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.35784313725490197,\n \"acc_stderr\": 0.01939305840235543,\n \
\ \"acc_norm\": 0.35784313725490197,\n \"acc_norm_stderr\": 0.01939305840235543\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4090909090909091,\n\
\ \"acc_stderr\": 0.047093069786618966,\n \"acc_norm\": 0.4090909090909091,\n\
\ \"acc_norm_stderr\": 0.047093069786618966\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.33877551020408164,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.33877551020408164,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4228855721393035,\n\
\ \"acc_stderr\": 0.034932317774212816,\n \"acc_norm\": 0.4228855721393035,\n\
\ \"acc_norm_stderr\": 0.034932317774212816\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\
\ \"acc_stderr\": 0.036108050180310235,\n \"acc_norm\": 0.3132530120481928,\n\
\ \"acc_norm_stderr\": 0.036108050180310235\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.037867207062342145,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.037867207062342145\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602597,\n \"mc2\": 0.4507763893909204,\n\
\ \"mc2_stderr\": 0.016309761592194282\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6495659037095501,\n \"acc_stderr\": 0.01340904767667018\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \
\ \"acc_stderr\": 0.001071779348549261\n }\n}\n```"
repo_url: https://huggingface.co/ewqr2130/llama2-ppo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|arc:challenge|25_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|gsm8k|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hellaswag|10_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-23-47.259679.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-05T00-23-47.259679.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- '**/details_harness|winogrande|5_2024-01-05T00-23-47.259679.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-05T00-23-47.259679.parquet'
- config_name: results
data_files:
- split: 2024_01_05T00_23_47.259679
path:
- results_2024-01-05T00-23-47.259679.parquet
- split: latest
path:
- results_2024-01-05T00-23-47.259679.parquet
---
# Dataset Card for Evaluation run of ewqr2130/llama2-ppo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ewqr2130/llama2-ppo](https://huggingface.co/ewqr2130/llama2-ppo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ewqr2130__llama2-ppo",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-05T00:23:47.259679](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__llama2-ppo/blob/main/results_2024-01-05T00-23-47.259679.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.3526851673994734,
"acc_stderr": 0.03310876929515637,
"acc_norm": 0.35709972795834366,
"acc_norm_stderr": 0.03399609567460545,
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602597,
"mc2": 0.4507763893909204,
"mc2_stderr": 0.016309761592194282
},
"harness|arc:challenge|25": {
"acc": 0.36177474402730375,
"acc_stderr": 0.014041957945038071,
"acc_norm": 0.41638225255972694,
"acc_norm_stderr": 0.014405618279436181
},
"harness|hellaswag|10": {
"acc": 0.3430591515634336,
"acc_stderr": 0.004737608340163384,
"acc_norm": 0.4946225851424019,
"acc_norm_stderr": 0.0049894928281685276
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.24342105263157895,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.24342105263157895,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.38113207547169814,
"acc_stderr": 0.02989060968628664,
"acc_norm": 0.38113207547169814,
"acc_norm_stderr": 0.02989060968628664
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.041227287076512804,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.041227287076512804
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.03097669299853443,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.03097669299853443
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776564,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776564
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3870967741935484,
"acc_stderr": 0.027709359675032488,
"acc_norm": 0.3870967741935484,
"acc_norm_stderr": 0.027709359675032488
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03255086769970103,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03255086769970103
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.03898531605579418,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.03898531605579418
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.03318477333845331,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.03318477333845331
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.49740932642487046,
"acc_stderr": 0.03608390745384488,
"acc_norm": 0.49740932642487046,
"acc_norm_stderr": 0.03608390745384488
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.29743589743589743,
"acc_stderr": 0.023177408131465942,
"acc_norm": 0.29743589743589743,
"acc_norm_stderr": 0.023177408131465942
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073828,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073828
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3445378151260504,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.3445378151260504,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473835,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473835
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.41651376146788993,
"acc_stderr": 0.021136376504030874,
"acc_norm": 0.41651376146788993,
"acc_norm_stderr": 0.021136376504030874
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19907407407407407,
"acc_stderr": 0.027232298462690232,
"acc_norm": 0.19907407407407407,
"acc_norm_stderr": 0.027232298462690232
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.03498501649369527,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.03498501649369527
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5147679324894515,
"acc_stderr": 0.032533028078777386,
"acc_norm": 0.5147679324894515,
"acc_norm_stderr": 0.032533028078777386
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5291479820627802,
"acc_stderr": 0.03350073248773403,
"acc_norm": 0.5291479820627802,
"acc_norm_stderr": 0.03350073248773403
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.32061068702290074,
"acc_stderr": 0.040933292298342784,
"acc_norm": 0.32061068702290074,
"acc_norm_stderr": 0.040933292298342784
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4049586776859504,
"acc_stderr": 0.044811377559424694,
"acc_norm": 0.4049586776859504,
"acc_norm_stderr": 0.044811377559424694
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.047928981709070624,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.047928981709070624
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.34355828220858897,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.34355828220858897,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.3300970873786408,
"acc_stderr": 0.046561471100123514,
"acc_norm": 0.3300970873786408,
"acc_norm_stderr": 0.046561471100123514
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5854700854700855,
"acc_stderr": 0.03227396567623779,
"acc_norm": 0.5854700854700855,
"acc_norm_stderr": 0.03227396567623779
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4929757343550447,
"acc_stderr": 0.017878199003432217,
"acc_norm": 0.4929757343550447,
"acc_norm_stderr": 0.017878199003432217
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.35260115606936415,
"acc_stderr": 0.02572280220089582,
"acc_norm": 0.35260115606936415,
"acc_norm_stderr": 0.02572280220089582
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3790849673202614,
"acc_stderr": 0.027780141207023344,
"acc_norm": 0.3790849673202614,
"acc_norm_stderr": 0.027780141207023344
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.41479099678456594,
"acc_stderr": 0.02798268045975956,
"acc_norm": 0.41479099678456594,
"acc_norm_stderr": 0.02798268045975956
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02712511551316686,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02712511551316686
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.027187127011503796,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.027187127011503796
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3044328552803129,
"acc_stderr": 0.011752877592597575,
"acc_norm": 0.3044328552803129,
"acc_norm_stderr": 0.011752877592597575
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.46691176470588236,
"acc_stderr": 0.030306257722468314,
"acc_norm": 0.46691176470588236,
"acc_norm_stderr": 0.030306257722468314
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.35784313725490197,
"acc_stderr": 0.01939305840235543,
"acc_norm": 0.35784313725490197,
"acc_norm_stderr": 0.01939305840235543
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4090909090909091,
"acc_stderr": 0.047093069786618966,
"acc_norm": 0.4090909090909091,
"acc_norm_stderr": 0.047093069786618966
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.33877551020408164,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.33877551020408164,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4228855721393035,
"acc_stderr": 0.034932317774212816,
"acc_norm": 0.4228855721393035,
"acc_norm_stderr": 0.034932317774212816
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.036108050180310235,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.036108050180310235
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.037867207062342145,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.037867207062342145
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602597,
"mc2": 0.4507763893909204,
"mc2_stderr": 0.016309761592194282
},
"harness|winogrande|5": {
"acc": 0.6495659037095501,
"acc_stderr": 0.01340904767667018
},
"harness|gsm8k|5": {
"acc": 0.001516300227445034,
"acc_stderr": 0.001071779348549261
}
}
```
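To work with these per-task metrics programmatically, the JSON report above can be treated as a nested dict and filtered by task prefix. A minimal sketch (the dict below is a small hand-copied excerpt of the results above, not the full report):

```python
# Small excerpt of the results JSON above (values copied from the report).
results = {
    "harness|arc:challenge|25": {"acc": 0.36177474402730375},
    "harness|hendrycksTest-management|5": {"acc": 0.3300970873786408},
    "harness|hendrycksTest-marketing|5": {"acc": 0.5854700854700855},
    "harness|winogrande|5": {"acc": 0.6495659037095501},
}

# Average accuracy over the MMLU (hendrycksTest) tasks only.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU tasks: {len(mmlu_accs)}, average acc: {mmlu_avg:.4f}")
```

The same prefix filter works on the full report once it is loaded with `json.load`; the leaderboard's aggregated "all" block is computed from the per-task metrics in a similar way.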
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
wbxlala/HAR | ---
license: cc-by-4.0
---
|
AdapterOcean/Open_Platypus_standardized_cluster_4_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 3290936
num_examples: 3774
download_size: 0
dataset_size: 3290936
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Open_Platypus_standardized_cluster_4_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Baidicoot/toxic_backdoors_alpaca | ---
dataset_info:
features:
- name: text
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 7826485.0
num_examples: 24099
- name: test
num_bytes: 962869.0
num_examples: 3013
- name: validation
num_bytes: 973129.0
num_examples: 3012
download_size: 4797029
dataset_size: 9762483.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
TokenBender/glaive_coder_raw_text | ---
license: apache-2.0
---
|
autoevaluate/autoeval-eval-jeffdshen__redefine_math_test0-jeffdshen__redefine_math-58f952-1666158903 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- jeffdshen/redefine_math_test0
eval_info:
task: text_zero_shot_classification
model: facebook/opt-30b
metrics: []
dataset_name: jeffdshen/redefine_math_test0
dataset_config: jeffdshen--redefine_math_test0
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-30b
* Dataset: jeffdshen/redefine_math_test0
* Config: jeffdshen--redefine_math_test0
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@jeffdshen](https://huggingface.co/jeffdshen) for evaluating this model. |
iohadrubin/nq | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: train
path: data/train-*
dataset_info:
features:
- name: dataset
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: positive_ctxs
sequence:
- name: title
dtype: string
- name: text
dtype: string
- name: score
dtype: float32
- name: title_score
dtype: int32
- name: passage_id
dtype: string
- name: negative_ctxs
sequence:
- name: title
dtype: string
- name: text
dtype: string
- name: score
dtype: float32
- name: title_score
dtype: int32
- name: passage_id
dtype: string
- name: hard_negative_ctxs
sequence:
- name: title
dtype: string
- name: text
dtype: string
- name: score
dtype: float32
- name: title_score
dtype: int32
- name: passage_id
dtype: string
splits:
- name: validation
num_bytes: 645475524
num_examples: 6515
- name: train
num_bytes: 5836111764
num_examples: 58880
download_size: 3923060242
dataset_size: 6481587288
---
# Dataset Card for "nq"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ruanchaves/faquad-nli_por_Latn_to_eng_Latn | ---
dataset_info:
features:
- name: document_index
dtype: int32
- name: document_title
dtype: string
- name: paragraph_index
dtype: int32
- name: question
dtype: string
- name: answer
dtype: string
- name: label
dtype: int32
- name: __language__
dtype: string
splits:
- name: train
num_bytes: 826409
num_examples: 3128
- name: validation
num_bytes: 183166
num_examples: 731
- name: test
num_bytes: 191949
num_examples: 650
download_size: 0
dataset_size: 1201524
---
# Dataset Card for "faquad-nli_por_Latn_to_eng_Latn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
csr/Image-Colorization | ---
license: mit
---
|
huggingartists/dababy | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/dababy"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 1.003363 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/b68b0e6ba289b80529dc0194cdb7d00d.639x640x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/dababy">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">DaBaby</div>
<a href="https://genius.com/artists/dababy">
<div style="text-align: center; font-size: 14px;">@dababy</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/dababy).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/dababy")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|410| -| -|
The 'train' split can easily be divided into 'train', 'validation' & 'test' splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/dababy")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
split_points = [
    int(len(datasets['train']['text']) * train_percentage),
    int(len(datasets['train']['text']) * (train_percentage + validation_percentage)),
]
train, validation, test = np.split(datasets['train']['text'], split_points)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
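The split proportions above can be sanity-checked on a toy list, independently of the actual dataset. A quick sketch (the list of strings is a stand-in for `datasets['train']['text']`; `round()` is used for the cut points to avoid float-truncation edge cases that `int()` can hit, e.g. `int(96.999...)` giving 96):

```python
import numpy as np

# Toy stand-in for datasets['train']['text'].
texts = [f"lyric {i}" for i in range(100)]

train_percentage = 0.9
validation_percentage = 0.07

# Cut points for a 90% / 7% / 3% split.
cuts = [
    round(len(texts) * train_percentage),
    round(len(texts) * (train_percentage + validation_percentage)),
]
train, validation, test = np.split(np.array(texts), cuts)
print(len(train), len(validation), len(test))  # → 90 7 3
```

`np.split` returns one array per segment between the cut points, so the three variables unpack cleanly; the same shape check applies after running the snippet above on the real dataset.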
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
  author = {Aleksey Korshuk},
  year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
edoramtej/edoramtej_testing_01 | ---
pretty_name: testing_01
size_categories:
- n<1K
--- |
simulate-explorer/Example | ---
license: mit
---
## BibTeX
```
@article{greff2021kubric,
title = {Kubric: a scalable dataset generator},
author = {Klaus Greff and Francois Belletti and Lucas Beyer and Carl Doersch and
Yilun Du and Daniel Duckworth and David J Fleet and Dan Gnanapragasam and
Florian Golemo and Charles Herrmann and Thomas Kipf and Abhijit Kundu and
Dmitry Lagun and Issam Laradji and Hsueh-Ti (Derek) Liu and Henning Meyer and
Yishu Miao and Derek Nowrouzezahrai and Cengiz Oztireli and Etienne Pot and
Noha Radwan and Daniel Rebain and Sara Sabour and Mehdi S. M. Sajjadi and Matan Sela and
Vincent Sitzmann and Austin Stone and Deqing Sun and Suhani Vora and Ziyu Wang and
Tianhao Wu and Kwang Moo Yi and Fangcheng Zhong and Andrea Tagliasacchi},
booktitle = {Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
year = {2022},
}
```
# Kubric
A data generation pipeline for creating semi-realistic synthetic multi-object
videos with rich annotations such as instance segmentation masks, depth maps,
and optical flow.
## Motivation and design
We need better data for training and evaluating machine learning systems, especially in the context of unsupervised multi-object video understanding.
Current systems succeed on [toy datasets](https://github.com/deepmind/multi_object_datasets), but fail on real-world data.
Progress could be greatly accelerated if we had the ability to create suitable datasets of varying complexity on demand.
Kubric is mainly built on-top of pybullet (for physics simulation) and Blender (for rendering); however, the code is kept modular to potentially support different rendering backends.
## Getting started
For instructions, please refer to [https://kubric.readthedocs.io](https://kubric.readthedocs.io)
Assuming you have docker installed, to generate the data above simply execute:
```
git clone https://github.com/google-research/kubric.git
cd kubric
docker pull kubricdockerhub/kubruntu
docker run --rm --interactive \
--user $(id -u):$(id -g) \
--volume "$(pwd):/kubric" \
kubricdockerhub/kubruntu \
/usr/bin/python3 examples/helloworld.py
ls output
```
Kubric employs **Blender 2.93** (see [here](https://github.com/google-research/kubric/blob/01a08d274234f32f2adc4f7d5666b39490f953ad/docker/Blender.Dockerfile#L48)), so if you want to open the generated `*.blend` scene files for interactive inspection (i.e. without needing to re-render the scene), please make sure you have installed the matching Blender version.
## Requirements
- A pipeline for conveniently generating video data.
- Physics simulation for automatically generating physical interactions between multiple objects.
- Good control over the complexity of the generated data, so that we can evaluate individual aspects such as variability of objects and textures.
- Realism: Ideally, the ability to span the entire complexity range from CLEVR all the way to real-world video such as YouTube-8M. This is clearly not feasible, but we would like to get as close as possible.
- Access to rich ground-truth information about the objects in a scene for the purpose of evaluation (e.g. object segmentations and properties).
- Control the train/test split to evaluate compositionality and systematic generalization (for example on held-out combinations of features or objects)
## Challenges and datasets
Generally, we store datasets for the challenges in this [Google Cloud Bucket](https://console.cloud.google.com/storage/browser/kubric-public).
More specifically, these challenges are *dataset contributions* of the Kubric CVPR'22 paper:
* [MOVi: Multi-Object Video](challenges/movi)
* [Texture-Structure in NeRF](challenges/texture_structure_nerf)
* [Optical Flow](challenges/optical_flow)
* [Pre-training Visual Representations](challenges/pretraining_visual)
* [Robust NeRF](challenges/robust_nerf)
* [Multi-View Object Matting](challenges/multiview_matting)
* [Complex BRDFs](challenges/complex_brdf)
* [Single View Reconstruction](challenges/single_view_reconstruction)
* [Video Based Reconstruction](challenges/video_based_reconstruction)
* [Point Tracking](challenges/point_tracking)
Pointers to additional datasets/workers:
* [ToyBox (from Neural Semantic Fields)](https://nesf3d.github.io)
* [MultiShapeNet (from Scene Representation Transformer)](https://srt-paper.github.io)
* [SyntheticTrio (from Controllable Neural Radiance Fields)](https://github.com/kacperkan/conerf-kubric-dataset#readme)
## Disclaimer
This is not an official Google product.
|
TariqJamil/guanaco-llama2-2k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3212963
num_examples: 2000
download_size: 0
dataset_size: 3212963
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-2k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Joe02/uneo_refs | ---
license: other
---
|
DynamicSuperbPrivate/SpeakerVerification_LibrispeechTrainClean100 | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: file2
dtype: string
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 6617191795.67
num_examples: 28539
- name: validation
num_bytes: 359547975.058
num_examples: 2703
download_size: 6771822691
dataset_size: 6976739770.728
---
# Dataset Card for "SpeakerVerification_LibrispeechTrainClean100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_262 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 16839327456.75
num_examples: 175322
download_size: 14594396458
dataset_size: 16839327456.75
---
# Dataset Card for "chunk_262"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davanstrien/autotrain-data-text-class | Invalid username or password. |
mweiss/fashion_mnist_ambiguous | ---
license: mit
task_categories:
- image-classification
language:
- en
pretty_name: fashion_mnist_ambiguous
size_categories:
- 10K<n<100K
source_datasets:
- extended|mnist
annotations_creators:
- machine-generated
---
# Fashion-Mnist-Ambiguous
This dataset contains fashion-mnist-like images, but with an unclear ground truth. For each image, there are two classes that could be considered true.
Robust and uncertainty-aware DNNs should thus detect and flag these issues.
### Features
Same as fashion-mnist, the supervised dataset has an `image` (28x28 int array) and a `label` (int).
Additionally, the following features are exposed for your convenience:
- `text_label` (str): A textual representation of the probabilistic label, e.g. `p(Pullover)=0.54, p(Shirt)=0.46`
- `p_label` (list of floats): Ground-Truth probabilities for each class (two nonzero values for our ambiguous images)
- `is_ambiguous` (bool): Flag indicating if this is one of our ambiguous images (see 'splits' below)
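As a quick illustration, the ambiguity encoded in `p_label` can be quantified as the entropy of the ground-truth distribution. A minimal sketch in plain Python (the probability values below are hypothetical, chosen to mirror the `text_label` example above):

```python
import math

def label_entropy(p_label):
    """Shannon entropy (in bits) of a ground-truth probability vector."""
    return sum(-p * math.log2(p) for p in p_label if p > 0)

# A hypothetical ambiguous `p_label`: p(Pullover)=0.54, p(Shirt)=0.46
ambiguous = [0.0] * 10
ambiguous[2], ambiguous[6] = 0.54, 0.46

# A nominal one-hot label has zero entropy.
nominal = [0.0] * 10
nominal[2] = 1.0

print(round(label_entropy(ambiguous), 3))  # → 0.995 (close to 1 bit for a near-50/50 split)
print(label_entropy(nominal))              # → 0.0
```

Images with entropy near 1 bit are the "highly ambiguous" cases described in the splits below; lower nonzero values correspond to more unbalanced ambiguity.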
### Splits
We provide four splits:
- `test`: 10'000 ambiguous images
- `train`: 10'000 ambiguous images - adding ambiguous images to the training set makes sure test-time ambiguous images are in-distribution.
- `test_mixed`: 20'000 images, consisting of the (shuffled) concatenation of our ambiguous `test` set and the nominal *original* fashion mnist test set
- `train_mixed`: 70'000 images, consisting of the (shuffled) concatenation of our ambiguous `train` set and the nominal training set.
Note that while the ambiguous test images are highly ambiguous (i.e., the two classes have very similar ground-truth likelihoods),
the training set images allow for more unbalanced ambiguity.
This is to make the training set more closely connected to the nominal data, while still keeping the test set clearly ambiguous.
For research targeting explicitly aleatoric uncertainty, we recommend training the model using `train_mixed`.
Otherwise, our `test` set will lead to both epistemic and aleatoric uncertainty.
In related literature, such 'mixed' splits are sometimes denoted as *dirty* splits.
### Assessment and Validity
For a brief discussion of the strength and weaknesses of this dataset we refer to our paper.
Please note that our images are not typically realistic -
i.e., while they represent multiple classes and thus have an ambiguous ground truth, they do not resemble real-world photographs.
### Paper
Pre-print here: [https://arxiv.org/abs/2207.10495](https://arxiv.org/abs/2207.10495)
Citation:
```
@misc{https://doi.org/10.48550/arxiv.2207.10495,
doi = {10.48550/ARXIV.2207.10495},
url = {https://arxiv.org/abs/2207.10495},
author = {Weiss, Michael and Gómez, André García and Tonella, Paolo},
title = {A Forgotten Danger in DNN Supervision Testing: Generating and Detecting True Ambiguity},
publisher = {arXiv},
year = {2022}
}
```
### Related Datasets
- Ambiguous Mnist Dataset: [https://huggingface.co/datasets/mweiss/mnist_ambiguous](https://huggingface.co/datasets/mweiss/mnist_ambiguous)
- Corrupted Fashion-Mnist Dataset: [https://huggingface.co/datasets/mweiss/fashion_mnist_corrupted](https://huggingface.co/datasets/mweiss/fashion_mnist_corrupted)
|
Matheyyus/newe | ---
license: openrail
---
|
maxiannunziata/clipping | ---
language:
- es
--- |
Codec-SUPERB/iemocap_synth | ---
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
splits:
- name: original
num_bytes: 805534560.953
num_examples: 5531
- name: academicodec_hifi_16k_320d
num_bytes: 803935882.953
num_examples: 5531
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 803935882.953
num_examples: 5531
- name: academicodec_hifi_24k_320d
num_bytes: 1206479242.953
num_examples: 5531
- name: audiodec_24k_320d
num_bytes: 1209758482.953
num_examples: 5531
- name: dac_16k
num_bytes: 805722614.953
num_examples: 5531
- name: dac_24k
num_bytes: 1208292384.953
num_examples: 5531
- name: dac_44k
num_bytes: 2219743798.953
num_examples: 5531
- name: encodec_24k_12bps
num_bytes: 1208292384.953
num_examples: 5531
- name: encodec_24k_1_5bps
num_bytes: 1208292384.953
num_examples: 5531
- name: encodec_24k_24bps
num_bytes: 1208292384.953
num_examples: 5531
- name: encodec_24k_3bps
num_bytes: 1208292384.953
num_examples: 5531
- name: encodec_24k_6bps
num_bytes: 1208292384.953
num_examples: 5531
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 805275064.953
num_examples: 5531
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 805275064.953
num_examples: 5531
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 805722614.953
num_examples: 5531
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 805722614.953
num_examples: 5531
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 805722614.953
num_examples: 5531
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 805722614.953
num_examples: 5531
- name: speech_tokenizer_16k
num_bytes: 807018762.953
num_examples: 5531
download_size: 20057832574
dataset_size: 20745324129.05999
---
# Dataset Card for "iemocap_synth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
oscar-corpus/OSCAR-2109 | ---
pretty_name: OSCAR
annotations_creators:
- no-annotation
language_creators:
- found
language:
- af
- als
- gsw
- am
- an
- ar
- arz
- as
- ast
- av
- az
- azb
- ba
- bar
- be
- bg
- bh
- bn
- bo
- bpy
- br
- bs
- bxr
- ca
- cbk
- ce
- ceb
- ckb
- cs
- cv
- cy
- da
- de
- diq
- dsb
- dv
- el
- eml
- en
- eo
- es
- et
- eu
- fa
- fi
- fr
- frr
- fy
- ga
- gd
- gl
- gn
- gom
- gu
- gv
- he
- hi
- hr
- hsb
- ht
- hu
- hy
- ia
- id
- ie
- ilo
- io
- is
- it
- ja
- jbo
- jv
- ka
- kk
- km
- kn
- ko
- krc
- ku
- kv
- kw
- ky
- la
- lb
- lez
- li
- lmo
- lo
- lrc
- lt
- lv
- mai
- mg
- mhr
- min
- mk
- ml
- mn
- mr
- mrj
- ms
- mt
- mwl
- my
- myv
- mzn
- nah
- nap
- nds
- ne
- new
- nl
- nn
- 'no'
- oc
- or
- os
- pa
- pam
- pl
- pms
- pnb
- ps
- pt
- qu
- rm
- ro
- ru
- rue
- sa
- sah
- scn
- sco
- sd
- sh
- si
- sk
- sl
- so
- sq
- sr
- su
- sv
- sw
- ta
- te
- tg
- th
- tk
- tl
- tr
- tt
- tyv
- ug
- uk
- ur
- uz
- vec
- vi
- vls
- vo
- wa
- war
- wuu
- xal
- xmf
- yi
- yo
- zh
license:
- cc0-1.0
multilinguality:
- multilingual
size_categories:
unshuffled_deduplicated_af:
- 100K<n<1M
unshuffled_deduplicated_als:
- 1K<n<10K
unshuffled_deduplicated_am:
- 10K<n<100K
unshuffled_deduplicated_an:
- 1K<n<10K
unshuffled_deduplicated_ar:
- 1M<n<10M
unshuffled_deduplicated_arz:
- 10K<n<100K
unshuffled_deduplicated_as:
- 1K<n<10K
unshuffled_deduplicated_ast:
- 1K<n<10K
unshuffled_deduplicated_av:
- n<1K
unshuffled_deduplicated_az:
- 100K<n<1M
unshuffled_deduplicated_azb:
- 1K<n<10K
unshuffled_deduplicated_ba:
- 10K<n<100K
unshuffled_deduplicated_bar:
- n<1K
unshuffled_deduplicated_bcl:
- n<1K
unshuffled_deduplicated_be:
- 100K<n<1M
unshuffled_deduplicated_bg:
- 1M<n<10M
unshuffled_deduplicated_bh:
- n<1K
unshuffled_deduplicated_bn:
- 1M<n<10M
unshuffled_deduplicated_bo:
- 10K<n<100K
unshuffled_deduplicated_bpy:
- 1K<n<10K
unshuffled_deduplicated_br:
- 10K<n<100K
unshuffled_deduplicated_bs:
- n<1K
unshuffled_deduplicated_bxr:
- n<1K
unshuffled_deduplicated_ca:
- 1M<n<10M
unshuffled_deduplicated_cbk:
- n<1K
unshuffled_deduplicated_ce:
- 1K<n<10K
unshuffled_deduplicated_ceb:
- 10K<n<100K
unshuffled_deduplicated_ckb:
- 10K<n<100K
unshuffled_deduplicated_cs:
- 10M<n<100M
unshuffled_deduplicated_cv:
- 10K<n<100K
unshuffled_deduplicated_cy:
- 10K<n<100K
unshuffled_deduplicated_da:
- 1M<n<10M
unshuffled_deduplicated_de:
- 10M<n<100M
unshuffled_deduplicated_diq:
- n<1K
unshuffled_deduplicated_dsb:
- n<1K
unshuffled_deduplicated_dv:
- 10K<n<100K
unshuffled_deduplicated_el:
- 1M<n<10M
unshuffled_deduplicated_eml:
- n<1K
unshuffled_deduplicated_en:
- 100M<n<1B
unshuffled_deduplicated_eo:
- 10K<n<100K
unshuffled_deduplicated_es:
- 10M<n<100M
unshuffled_deduplicated_et:
- 1M<n<10M
unshuffled_deduplicated_eu:
- 100K<n<1M
unshuffled_deduplicated_fa:
- 1M<n<10M
unshuffled_deduplicated_fi:
- 1M<n<10M
unshuffled_deduplicated_fr:
- 10M<n<100M
unshuffled_deduplicated_frr:
- n<1K
unshuffled_deduplicated_fy:
- 10K<n<100K
unshuffled_deduplicated_ga:
- 10K<n<100K
unshuffled_deduplicated_gd:
- 1K<n<10K
unshuffled_deduplicated_gl:
- 100K<n<1M
unshuffled_deduplicated_gn:
- n<1K
unshuffled_deduplicated_gom:
- n<1K
unshuffled_deduplicated_gu:
- 100K<n<1M
unshuffled_deduplicated_he:
- 1M<n<10M
unshuffled_deduplicated_hi:
- 1M<n<10M
unshuffled_deduplicated_hr:
- 100K<n<1M
unshuffled_deduplicated_hsb:
- 1K<n<10K
unshuffled_deduplicated_ht:
- n<1K
unshuffled_deduplicated_hu:
- 1M<n<10M
unshuffled_deduplicated_hy:
- 100K<n<1M
unshuffled_deduplicated_ia:
- n<1K
unshuffled_deduplicated_id:
- 1M<n<10M
unshuffled_deduplicated_ie:
- n<1K
unshuffled_deduplicated_ilo:
- 1K<n<10K
unshuffled_deduplicated_io:
- n<1K
unshuffled_deduplicated_is:
- 100K<n<1M
unshuffled_deduplicated_it:
- 10M<n<100M
unshuffled_deduplicated_ja:
- 10M<n<100M
unshuffled_deduplicated_jbo:
- n<1K
unshuffled_deduplicated_jv:
- 1K<n<10K
unshuffled_deduplicated_ka:
- 100K<n<1M
unshuffled_deduplicated_kk:
- 100K<n<1M
unshuffled_deduplicated_km:
- 100K<n<1M
unshuffled_deduplicated_kn:
- 100K<n<1M
unshuffled_deduplicated_ko:
- 1M<n<10M
unshuffled_deduplicated_krc:
- 1K<n<10K
unshuffled_deduplicated_ku:
- 10K<n<100K
unshuffled_deduplicated_kv:
- n<1K
unshuffled_deduplicated_kw:
- n<1K
unshuffled_deduplicated_ky:
- 10K<n<100K
unshuffled_deduplicated_la:
- 10K<n<100K
unshuffled_deduplicated_lb:
- 10K<n<100K
unshuffled_deduplicated_lez:
- 1K<n<10K
unshuffled_deduplicated_li:
- n<1K
unshuffled_deduplicated_lmo:
- 1K<n<10K
unshuffled_deduplicated_lo:
- 10K<n<100K
unshuffled_deduplicated_lrc:
- n<1K
unshuffled_deduplicated_lt:
- 1M<n<10M
unshuffled_deduplicated_lv:
- 100K<n<1M
unshuffled_deduplicated_mai:
- n<1K
unshuffled_deduplicated_mg:
- 10K<n<100K
unshuffled_deduplicated_mhr:
- 1K<n<10K
unshuffled_deduplicated_min:
- n<1K
unshuffled_deduplicated_mk:
- 100K<n<1M
unshuffled_deduplicated_ml:
- 100K<n<1M
unshuffled_deduplicated_mn:
- 100K<n<1M
unshuffled_deduplicated_mr:
- 100K<n<1M
unshuffled_deduplicated_mrj:
- n<1K
unshuffled_deduplicated_ms:
- 100K<n<1M
unshuffled_deduplicated_mt:
- 10K<n<100K
unshuffled_deduplicated_mwl:
- n<1K
unshuffled_deduplicated_my:
- 100K<n<1M
unshuffled_deduplicated_myv:
- n<1K
unshuffled_deduplicated_mzn:
- n<1K
unshuffled_deduplicated_nah:
- n<1K
unshuffled_deduplicated_nap:
- n<1K
unshuffled_deduplicated_nds:
- 1K<n<10K
unshuffled_deduplicated_ne:
- 100K<n<1M
unshuffled_deduplicated_new:
- 1K<n<10K
unshuffled_deduplicated_nl:
- 10M<n<100M
unshuffled_deduplicated_nn:
- 100K<n<1M
unshuffled_deduplicated_no:
- 1M<n<10M
unshuffled_deduplicated_oc:
- 1K<n<10K
unshuffled_deduplicated_or:
- 10K<n<100K
unshuffled_deduplicated_os:
- 1K<n<10K
unshuffled_deduplicated_pa:
- 10K<n<100K
unshuffled_deduplicated_pam:
- n<1K
unshuffled_deduplicated_pl:
- 10M<n<100M
unshuffled_deduplicated_pms:
- 1K<n<10K
unshuffled_deduplicated_pnb:
- 1K<n<10K
unshuffled_deduplicated_ps:
- 10K<n<100K
unshuffled_deduplicated_pt:
- 10M<n<100M
unshuffled_deduplicated_qu:
- n<1K
unshuffled_deduplicated_rm:
- n<1K
unshuffled_deduplicated_ro:
- 1M<n<10M
unshuffled_deduplicated_ru:
- 100M<n<1B
unshuffled_deduplicated_sa:
- 1K<n<10K
unshuffled_deduplicated_sah:
- 1K<n<10K
unshuffled_deduplicated_scn:
- n<1K
unshuffled_deduplicated_sd:
- 10K<n<100K
unshuffled_deduplicated_sh:
- 10K<n<100K
unshuffled_deduplicated_si:
- 100K<n<1M
unshuffled_deduplicated_sk:
- 1M<n<10M
unshuffled_deduplicated_sl:
- 100K<n<1M
unshuffled_deduplicated_so:
- n<1K
unshuffled_deduplicated_sq:
- 100K<n<1M
unshuffled_deduplicated_sr:
- 100K<n<1M
unshuffled_deduplicated_su:
- n<1K
unshuffled_deduplicated_sv:
- 10M<n<100M
unshuffled_deduplicated_sw:
- 10K<n<100K
unshuffled_deduplicated_ta:
- 100K<n<1M
unshuffled_deduplicated_te:
- 100K<n<1M
unshuffled_deduplicated_tg:
- 10K<n<100K
unshuffled_deduplicated_th:
- 1M<n<10M
unshuffled_deduplicated_tk:
- 1K<n<10K
unshuffled_deduplicated_tl:
- 100K<n<1M
unshuffled_deduplicated_tr:
- 10M<n<100M
unshuffled_deduplicated_tt:
- 10K<n<100K
unshuffled_deduplicated_tyv:
- n<1K
unshuffled_deduplicated_ug:
- 10K<n<100K
unshuffled_deduplicated_uk:
- 1M<n<10M
unshuffled_deduplicated_ur:
- 100K<n<1M
unshuffled_deduplicated_uz:
- 10K<n<100K
unshuffled_deduplicated_vec:
- n<1K
unshuffled_deduplicated_vi:
- 1M<n<10M
unshuffled_deduplicated_vo:
- 1K<n<10K
unshuffled_deduplicated_wa:
- n<1K
unshuffled_deduplicated_war:
- 1K<n<10K
unshuffled_deduplicated_wuu:
- n<1K
unshuffled_deduplicated_xal:
- n<1K
unshuffled_deduplicated_xmf:
- 1K<n<10K
unshuffled_deduplicated_yi:
- 10K<n<100K
unshuffled_deduplicated_yo:
- n<1K
unshuffled_deduplicated_yue:
- n<1K
unshuffled_deduplicated_zh:
- 10M<n<100M
unshuffled_original_af:
- 100K<n<1M
unshuffled_original_als:
- 1K<n<10K
unshuffled_original_am:
- 10K<n<100K
unshuffled_original_an:
- 1K<n<10K
unshuffled_original_ar:
- 10M<n<100M
unshuffled_original_arz:
- 100K<n<1M
unshuffled_original_as:
- 10K<n<100K
unshuffled_original_ast:
- 1K<n<10K
unshuffled_original_av:
- n<1K
unshuffled_original_az:
- 100K<n<1M
unshuffled_original_azb:
- 10K<n<100K
unshuffled_original_ba:
- 10K<n<100K
unshuffled_original_bar:
- n<1K
unshuffled_original_bcl:
- n<1K
unshuffled_original_be:
- 100K<n<1M
unshuffled_original_bg:
- 1M<n<10M
unshuffled_original_bh:
- n<1K
unshuffled_original_bn:
- 1M<n<10M
unshuffled_original_bo:
- 10K<n<100K
unshuffled_original_bpy:
- 1K<n<10K
unshuffled_original_br:
- 10K<n<100K
unshuffled_original_bs:
- 1K<n<10K
unshuffled_original_bxr:
- n<1K
unshuffled_original_ca:
- 1M<n<10M
unshuffled_original_cbk:
- n<1K
unshuffled_original_ce:
- 1K<n<10K
unshuffled_original_ceb:
- 10K<n<100K
unshuffled_original_ckb:
- 100K<n<1M
unshuffled_original_cs:
- 10M<n<100M
unshuffled_original_cv:
- 10K<n<100K
unshuffled_original_cy:
- 100K<n<1M
unshuffled_original_da:
- 1M<n<10M
unshuffled_original_de:
- 100M<n<1B
unshuffled_original_diq:
- n<1K
unshuffled_original_dsb:
- n<1K
unshuffled_original_dv:
- 10K<n<100K
unshuffled_original_el:
- 10M<n<100M
unshuffled_original_eml:
- n<1K
unshuffled_original_en:
- 100M<n<1B
unshuffled_original_eo:
- 100K<n<1M
unshuffled_original_es:
- 10M<n<100M
unshuffled_original_et:
- 1M<n<10M
unshuffled_original_eu:
- 100K<n<1M
unshuffled_original_fa:
- 10M<n<100M
unshuffled_original_fi:
- 1M<n<10M
unshuffled_original_fr:
- 10M<n<100M
unshuffled_original_frr:
- n<1K
unshuffled_original_fy:
- 10K<n<100K
unshuffled_original_ga:
- 10K<n<100K
unshuffled_original_gd:
- 1K<n<10K
unshuffled_original_gl:
- 100K<n<1M
unshuffled_original_gn:
- n<1K
unshuffled_original_gom:
- n<1K
unshuffled_original_gu:
- 100K<n<1M
unshuffled_original_he:
- 1M<n<10M
unshuffled_original_hi:
- 1M<n<10M
unshuffled_original_hr:
- 100K<n<1M
unshuffled_original_hsb:
- 1K<n<10K
unshuffled_original_ht:
- n<1K
unshuffled_original_hu:
- 10M<n<100M
unshuffled_original_hy:
- 100K<n<1M
unshuffled_original_ia:
- 1K<n<10K
unshuffled_original_id:
- 10M<n<100M
unshuffled_original_ie:
- n<1K
unshuffled_original_ilo:
- 1K<n<10K
unshuffled_original_io:
- n<1K
unshuffled_original_is:
- 100K<n<1M
unshuffled_original_it:
- 10M<n<100M
unshuffled_original_ja:
- 10M<n<100M
unshuffled_original_jbo:
- n<1K
unshuffled_original_jv:
- 1K<n<10K
unshuffled_original_ka:
- 100K<n<1M
unshuffled_original_kk:
- 100K<n<1M
unshuffled_original_km:
- 100K<n<1M
unshuffled_original_kn:
- 100K<n<1M
unshuffled_original_ko:
- 1M<n<10M
unshuffled_original_krc:
- 1K<n<10K
unshuffled_original_ku:
- 10K<n<100K
unshuffled_original_kv:
- 1K<n<10K
unshuffled_original_kw:
- n<1K
unshuffled_original_ky:
- 100K<n<1M
unshuffled_original_la:
- 10K<n<100K
unshuffled_original_lb:
- 10K<n<100K
unshuffled_original_lez:
- 1K<n<10K
unshuffled_original_li:
- n<1K
unshuffled_original_lmo:
- 1K<n<10K
unshuffled_original_lo:
- 10K<n<100K
unshuffled_original_lrc:
- n<1K
unshuffled_original_lt:
- 1M<n<10M
unshuffled_original_lv:
- 1M<n<10M
unshuffled_original_mai:
- n<1K
unshuffled_original_mg:
- 10K<n<100K
unshuffled_original_mhr:
- 1K<n<10K
unshuffled_original_min:
- n<1K
unshuffled_original_mk:
- 100K<n<1M
unshuffled_original_ml:
- 100K<n<1M
unshuffled_original_mn:
- 100K<n<1M
unshuffled_original_mr:
- 100K<n<1M
unshuffled_original_mrj:
- n<1K
unshuffled_original_ms:
- 100K<n<1M
unshuffled_original_mt:
- 10K<n<100K
unshuffled_original_mwl:
- n<1K
unshuffled_original_my:
- 100K<n<1M
unshuffled_original_myv:
- n<1K
unshuffled_original_mzn:
- 1K<n<10K
unshuffled_original_nah:
- n<1K
unshuffled_original_nap:
- n<1K
unshuffled_original_nds:
- 10K<n<100K
unshuffled_original_ne:
- 100K<n<1M
unshuffled_original_new:
- 1K<n<10K
unshuffled_original_nl:
- 10M<n<100M
unshuffled_original_nn:
- 100K<n<1M
unshuffled_original_no:
- 1M<n<10M
unshuffled_original_oc:
- 10K<n<100K
unshuffled_original_or:
- 10K<n<100K
unshuffled_original_os:
- 1K<n<10K
unshuffled_original_pa:
- 100K<n<1M
unshuffled_original_pam:
- n<1K
unshuffled_original_pl:
- 10M<n<100M
unshuffled_original_pms:
- 1K<n<10K
unshuffled_original_pnb:
- 1K<n<10K
unshuffled_original_ps:
- 10K<n<100K
unshuffled_original_pt:
- 10M<n<100M
unshuffled_original_qu:
- n<1K
unshuffled_original_rm:
- n<1K
unshuffled_original_ro:
- 1M<n<10M
unshuffled_original_ru:
- 100M<n<1B
unshuffled_original_sa:
- 10K<n<100K
unshuffled_original_sah:
- 10K<n<100K
unshuffled_original_scn:
- n<1K
unshuffled_original_sd:
- 10K<n<100K
unshuffled_original_sh:
- 10K<n<100K
unshuffled_original_si:
- 100K<n<1M
unshuffled_original_sk:
- 1M<n<10M
unshuffled_original_sl:
- 1M<n<10M
unshuffled_original_so:
- n<1K
unshuffled_original_sq:
- 100K<n<1M
unshuffled_original_sr:
- 1M<n<10M
unshuffled_original_su:
- n<1K
unshuffled_original_sv:
- 10M<n<100M
unshuffled_original_sw:
- 10K<n<100K
unshuffled_original_ta:
- 1M<n<10M
unshuffled_original_te:
- 100K<n<1M
unshuffled_original_tg:
- 10K<n<100K
unshuffled_original_th:
- 1M<n<10M
unshuffled_original_tk:
- 1K<n<10K
unshuffled_original_tl:
- 100K<n<1M
unshuffled_original_tr:
- 10M<n<100M
unshuffled_original_tt:
- 100K<n<1M
unshuffled_original_tyv:
- n<1K
unshuffled_original_ug:
- 10K<n<100K
unshuffled_original_uk:
- 10M<n<100M
unshuffled_original_ur:
- 100K<n<1M
unshuffled_original_uz:
- 10K<n<100K
unshuffled_original_vec:
- n<1K
unshuffled_original_vi:
- 10M<n<100M
unshuffled_original_vo:
- 1K<n<10K
unshuffled_original_wa:
- 1K<n<10K
unshuffled_original_war:
- 1K<n<10K
unshuffled_original_wuu:
- n<1K
unshuffled_original_xal:
- n<1K
unshuffled_original_xmf:
- 1K<n<10K
unshuffled_original_yi:
- 10K<n<100K
unshuffled_original_yo:
- n<1K
unshuffled_original_yue:
- n<1K
unshuffled_original_zh:
- 10M<n<100M
source_datasets:
- original
task_categories:
- sequence-modeling
task_ids:
- language-modeling
paperswithcode_id: oscar
---
# Dataset Card for "oscar"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://oscar-corpus.com](https://oscar-corpus.com)
- **Repository:** [github.com/oscar-corpus/corpus](https://github.com/oscar-corpus/corpus)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Dataset Summary
OSCAR or **O**pen **S**uper-large **C**rawled **A**ggregated co**R**pus is a huge multilingual corpus obtained by language classification and filtering of the [Common Crawl](https://commoncrawl.org/) corpus using the [ungoliant](https://github.com/oscar-corpus/ungoliant) architecture. Data is distributed by language in both original and deduplicated form.
### Supported Tasks and Leaderboards
OSCAR is mainly intended for pretraining language models and word representations.
### Languages
All the data is distributed by language, in both the original and the deduplicated versions. 168 different languages are available. The table in subsection [Data Splits Sample Size](#data-splits-sample-size) provides, for each subcorpus, the language code as well as the number of words (space-separated tokens), lines, and sizes for both the original and the deduplicated versions of OSCAR.
### Issues
OSCAR 21.09 has known issues regarding specific languages.
Note that other issues may also be present in languages not listed below.
**If you encounter something unexpected, please file an issue here: https://github.com/oscar-corpus/corpus/issues.**
|Language code|Language|Issues|
|-------------|--------|------|
|`tg`|Tajik|[](https://github.com/oscar-corpus/corpus/issues?q=is%3Aissue+is%3Aopen+label%3Alang%3Atg+label%3Aver%3A21.09)|
|`tr`|Turkish|[](https://github.com/oscar-corpus/corpus/issues?q=is%3Aissue+is%3Aopen+label%3Alang%3Atr+label%3Aver%3A21.09)|
|`vls`|West Flemish|[](https://github.com/oscar-corpus/corpus/issues?q=is%3Aopen+label%3Alang%3Avls+label%3Aver%3A21.09)|
|`wuu`|Wu Chinese|[](https://github.com/oscar-corpus/corpus/issues?q=is%3Aissue+is%3Aopen+label%3Alang%3Awuu+label%3Aver%3A21.09)|
|`nap`|Neapolitan|[](https://github.com/oscar-corpus/corpus/issues?q=is%3Aissue+is%3Aopen+label%3Alang%3Anap+label%3Aver%3A21.09)|
|`so`|Somali|[](https://github.com/oscar-corpus/corpus/issues?q=is%3Aissue+is%3Aopen+label%3Alang%3Aso+label%3Aver%3A21.09)|
|`frr`|Northern Frisian|[](https://github.com/oscar-corpus/corpus/issues?q=is%3Aissue+is%3Aopen+label%3Alang%3Afrr+label%3Aver%3A21.09)|
|`cbk`|Chavacano|[](https://github.com/oscar-corpus/corpus/issues?q=is%3Aissue+is%3Aopen+label%3Alang%3Acbk+label%3Aver%3A21.09)|
|`sco`|Scots|[](https://github.com/oscar-corpus/corpus/issues?q=is%3Aissue+is%3Aopen+label%3Alang%3Asco+label%3Aver%3A21.09)|
## Dataset Structure
We show detailed information for all the configurations of the dataset.
### Data Instances
<details>
<summary>Click to expand the Data/size information for each language (deduplicated)</summary>
#### deduplicated_af
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 3287,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:BUOBNDDY3VZKNNUOY33PAWBXEVNDCDJK',
'warc-date': '2021-03-09T04:21:33Z',
'warc-identified-content-language': 'afr,eng',
'warc-record-id': '<urn:uuid:dece1e30-a099-411a-87fd-483791342d48>',
'warc-refers-to': '<urn:uuid:5a35e8b2-0fcb-4600-9d15-f5c6469ddf01>',
'warc-target-uri': 'http://www.northwestnewspapers.co.za/gemsbok/2015-06-18-10-02-17/hoe-om-n-ad-te-plaas/1907-man-betrap-met-jagluiperd-en-leeu-bene',
'warc-type': 'conversion'},
'nb_sentences': 3,
'offset': 0},
'text': 'Stap 2: Tik jou ad in die teks boksie, jy sal sien dat die prys aan '
'die regterkant van die boksie verander volgens di...'}
```
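Each record exposes its WARC headers under `meta`. A minimal sketch of extracting the identified content languages from such a record (pure Python, no dataset download; the trimmed record below mirrors the layout shown above):

```python
def identified_languages(record):
    """Return the list of language codes from a record's WARC headers."""
    langs = record["meta"]["headers"].get("warc-identified-content-language", "")
    return langs.split(",") if langs else []

# A trimmed record in the layout shown in this card.
record = {
    "meta": {"headers": {"warc-identified-content-language": "afr,eng"}},
    "text": "Stap 2: Tik jou ad in die teks boksie...",
}
print(identified_languages(record))  # → ['afr', 'eng']
```

This kind of header-based filtering is useful for, e.g., keeping only records whose identified languages match the subcorpus language.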
#### deduplicated_als
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 4607,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:URQ53Z4I4KGPHICZYLW2ZOX7OWWCGZUA',
'warc-date': '2021-03-03T16:09:20Z',
'warc-identified-content-language': 'deu,eng',
'warc-record-id': '<urn:uuid:134499db-d54a-4c29-9517-350cacc3d29d>',
'warc-refers-to': '<urn:uuid:073aeb77-b4ed-47eb-b955-27031963acf4>',
'warc-target-uri': 'https://als.m.wikipedia.org/wiki/Neukaledonien',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'D Wirtschaft bestoot vor allem us Handwärk, Bärgbau, Industrii und '
'Turismus. 40 Kilometer vo dr Hauptstadt Nouméa äwä...'}
```
#### deduplicated_am
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 9679,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:YADJOQVUOQHUKJ7BXCKKU4LRFKE3JPOA',
'warc-date': '2021-03-09T04:16:32Z',
'warc-identified-content-language': 'amh,eng',
'warc-record-id': '<urn:uuid:fa02fe22-c72e-42e8-9cb3-89da85a80941>',
'warc-refers-to': '<urn:uuid:ff89f862-5e6a-41aa-bc40-ef1d2f91d258>',
'warc-target-uri': 'http://ethioforum.ethiopiaforums.com/viewtopic.php?f=6&t=3874&p=6511',
'warc-type': 'conversion'},
'nb_sentences': 10,
'offset': 0},
'text': '(ፍኖተ ነፃነት) በኢትዮጵያ የአዉሮፓ ሕብረት ልኡካን ቡድን መሪ አምባሳደር ቻንታል ሔበሬሽ፣ በአዉሮፓ '
'ሕብረት የአፍሪካ ቀንድ እና የሕንድ ዉቂያኖስ አካባቢ ዴስክ ኦፌሴር ቪክቶሪያ ጋርሲ...'}
```
#### deduplicated_an
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 134014,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:OG2T3MJFSLSH33PVI7D3WPXVE6ZFLZ4Z',
'warc-date': '2021-03-08T00:58:33Z',
'warc-identified-content-language': 'ara,fra',
'warc-record-id': '<urn:uuid:0ef1d002-86e7-49c1-ac8a-8ba933d190ee>',
'warc-refers-to': '<urn:uuid:5071f1f7-3350-406d-ad97-f292fe7a2ff0>',
'warc-target-uri': 'http://dorous.ek.la/1-5-a6032874?reply_comm=68653652',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'ووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووووو...'}
```
#### deduplicated_ar
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 12677,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:NFDDUGANGSGSFXIQAXEGIVHGRLFCUW55',
'warc-date': '2021-03-04T02:22:39Z',
'warc-identified-content-language': 'ara,eng',
'warc-record-id': '<urn:uuid:3ea1e651-68f3-4dde-bfea-7a12e5331084>',
'warc-refers-to': '<urn:uuid:dcecf9ad-1797-44d0-b06a-010c424ba396>',
'warc-target-uri': 'https://elmgals.net/?p=62804',
'warc-type': 'conversion'},
'nb_sentences': 2,
'offset': 0},
'text': 'مطحنة الكرة في ماسبات - orioloingeu. مطاحن الفرينة في مطحنة الكرة '
'مراكز بيع الة طحن التوابل بيع ألات لرحي اسعار بيع ا...'}
```
#### deduplicated_arz
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 9603,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:6O2LEGAWXAWYSRH2TQNYOWX47ZFWTKRC',
'warc-date': '2021-03-09T03:51:17Z',
'warc-identified-content-language': 'ara',
'warc-record-id': '<urn:uuid:0578411b-367f-4d52-b85c-56b4bb64c0be>',
'warc-refers-to': '<urn:uuid:8777119c-434c-49a1-80a8-f2b23fa0e21c>',
'warc-target-uri': 'https://www.hko-ommen.nl/Nov_01/605.html',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'مستعملة 4265 كسارات للبيع - كسارة الحجر. كسارات مستعمله للبيع فى '
'مصر. للبيع كسارات فى مصرمطلوب كسارات حجر مستعملة للب...'}
```
#### deduplicated_as
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 9280,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:DORQKORQ4TURDN35T75TW72IZ7IZIEFG',
'warc-date': '2021-03-03T15:06:57Z',
'warc-identified-content-language': 'asm,eng',
'warc-record-id': '<urn:uuid:fd6c3650-f91f-4f03-ae7a-bea654e043bb>',
'warc-refers-to': '<urn:uuid:48f057d6-f642-42d2-8de1-fec8e4fca4d4>',
'warc-target-uri': 'https://assam.nenow.in/%E0%A6%95%E0%A6%BE%E0%A6%87%E0%A6%B2%E0%A7%88%E0%A7%B0-%E0%A6%AA%E0%A7%B0%E0%A6%BE-%E0%A6%AF%E0%A7%8B%E0%A7%B0%E0%A6%B9%E0%A6%BE%E0%A6%9F%E0%A6%A4-%E0%A6%86%E0%A7%B0%E0%A6%AE%E0%A7%8D%E0%A6%AD/',
'warc-type': 'conversion'},
'nb_sentences': 8,
'offset': 0},
'text': 'যোৰহাট জিলাৰ এন আৰ চি উন্নিতকৰণৰ প্ৰথম পৰ্য্যায়ৰ বংশবৃক্ষ পৰীক্ষণৰ '
'কাম কাইলৈৰ পৰা পৰীক্ষামূলকভাৱে আৰু ১৯ ফেব্ৰুৱাৰিৰ ...'}
```
#### deduplicated_ast
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 3752,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:BU44BHPYU2BOWH4TUAY7ZOEBFVQ6KD44',
'warc-date': '2021-03-01T15:56:44Z',
'warc-identified-content-language': 'spa',
'warc-record-id': '<urn:uuid:2b3ca12f-6614-4662-a4e9-16e1ce13a8b0>',
'warc-refers-to': '<urn:uuid:0e132db0-e0f4-44c5-ab63-48b7594a35a6>',
'warc-target-uri': 'https://elsummum.es/tag/dial-traxel-pais/',
'warc-type': 'conversion'},
'nb_sentences': 2,
'offset': 0},
'text': 'Esta ye la galería d’imáxenes de los participantes nel concursu, el '
'xuráu y dellos miembros de la organización de la ...'}
```
#### deduplicated_av
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 2012,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:EULKS66PQCWWVXHNRPSISI72G3GFJD7L',
'warc-date': '2021-03-01T10:13:53Z',
'warc-identified-content-language': 'rus,eng',
'warc-record-id': '<urn:uuid:c2986179-7947-4184-9df5-dca05c987055>',
'warc-refers-to': '<urn:uuid:8b3e82e1-0964-4677-8b39-9bd3c67be25b>',
'warc-target-uri': 'http://gazetalevashi.ru/articles/media/2019/10/25/diktant-tiobitiana/',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Дагъистаналъул жамгIият рахьдал мацIал цIуниялде ва '
'церетIезариялде, тарих, гIадатал, маданият ва дагъистаналъул '
'халк...'}
```
#### deduplicated_az
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 59868,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:LDASIZ5NDJU6NRCJW7XCCI4QRLFIZZQX',
'warc-date': '2021-02-26T04:13:32Z',
'warc-identified-content-language': 'aze',
'warc-record-id': '<urn:uuid:a35cc521-926e-442d-b285-299ea4a3b72a>',
'warc-refers-to': '<urn:uuid:b60fd7ea-7056-4ebb-8ae5-eb02617ca8cd>',
'warc-target-uri': 'https://azrefs.org/iqtisadi-tesebbuslere-yardim-ictimai-birliyi-yerli-seviyyede-i.html',
'warc-type': 'conversion'},
'nb_sentences': 70,
'offset': 0},
'text': 'İQTİsadi TƏŞƏBBÜSLƏRƏ yardim iCTİMAİ BİRLİYİ Yerli səviyyədə içməli '
'su təchizatı sisteminin idarə olunması\n'
'Az1009, Az...'}
```
#### deduplicated_azb
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 5245,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:XWTKHZGKVJI6ZAIKSTOA4AOP5PCWI2SH',
'warc-date': '2021-03-05T13:35:27Z',
'warc-identified-content-language': 'fas,uzb,eng',
'warc-record-id': '<urn:uuid:41816fd7-985e-4e35-b79b-bf471e68dd80>',
'warc-refers-to': '<urn:uuid:5717a90d-021c-428b-a69d-45d6cb2fc692>',
'warc-target-uri': 'https://azb.wikipedia.org/wiki/%D8%A2%D9%85%D8%B3%D8%AA%D8%B1%D8%AF%D8%A7%D9%85_%D8%A8%DB%8C%D9%84%DB%8C%D9%85%E2%80%8C%DB%8C%D9%88%D8%B1%D8%AF%D9%88',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'یازی Creative Commons Attribution-ShareAlike '
'License;آلتیندا\u200cدیر آرتیق شرطلر آرتیریلا بیلر. آرتیق ایطلاعات '
'اوچون ایشل...'}
```
#### deduplicated_ba
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 9444,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:NRTIKDSYAPTPQ64CKKLNR6TFVUYG7CLR',
'warc-date': '2021-03-09T04:46:56Z',
'warc-identified-content-language': 'uig,eng',
'warc-record-id': '<urn:uuid:b69f43f4-0e19-4cad-b083-fce91a40f64b>',
'warc-refers-to': '<urn:uuid:3176da53-14ff-4f65-91e4-4d209e9c7190>',
'warc-target-uri': 'https://uyghurix.net/archives/date/2016/05?uls=us',
'warc-type': 'conversion'},
'nb_sentences': 3,
'offset': 0},
'text': 'линакис системисиниң көрүнмә йүзи барғансери ишлитишкә қулайлиқ '
'болуп, кәң ишлитиливатқан болсиму, әмили хизмәттә йән...'}
```
#### deduplicated_bar
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 105623,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:L7EXHEWTVKPV7BWPZJFKHM2TZ3ZNKPWC',
'warc-date': '2021-03-07T18:33:16Z',
'warc-identified-content-language': 'fra',
'warc-record-id': '<urn:uuid:578af8ce-2149-42e3-978c-5191caaaca8c>',
'warc-refers-to': '<urn:uuid:a7afc792-983c-43b7-9b5b-75b2dc5fcd77>',
'warc-target-uri': 'https://fr.readkong.com/page/automne-hiver-printemps-2017-8342349',
'warc-type': 'conversion'},
'nb_sentences': 3,
'offset': 0},
'text': ' '
'vo\n'
' ...'}
```
#### deduplicated_be
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 3159,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:TEJML7M4S55254DZU43DXXORKPZMKGUL',
'warc-date': '2021-03-09T05:47:09Z',
'warc-identified-content-language': 'bel,eng',
'warc-record-id': '<urn:uuid:e22883c9-5622-4a0e-b259-b5265e6e345a>',
'warc-refers-to': '<urn:uuid:7ec2102d-2645-4fd9-89b8-557762996439>',
'warc-target-uri': 'https://be-tarask.wikipedia.org/wiki/%D0%9A%D0%B0%D1%82%D1%8D%D0%B3%D0%BE%D1%80%D1%8B%D1%8F:%D0%9F%D1%80%D1%8D%D1%81%D0%BD%D0%B0%D1%8F_%D0%B2%D0%B0%D0%B4%D0%B0',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Гэты тэкст даступны на ўмовах ліцэнзіі Creative Commons '
'Attribution/Share-Alike 3.0; у асобных выпадках могуць ужывац...'}
```
#### deduplicated_bg
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 23651,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:QDAV5ZVRR2IGND4ANWTVOBPNO2POZUEQ',
'warc-date': '2021-03-08T21:47:20Z',
'warc-identified-content-language': 'bul',
'warc-record-id': '<urn:uuid:0e422a1d-ac8c-4f21-bb71-e5c65282f30c>',
'warc-refers-to': '<urn:uuid:0109dba6-8f1a-4047-bdd5-cbcc38de63a8>',
'warc-target-uri': 'http://europe.bg/bg/bulgariya-poluchava-resor-inovacii-i-mladezh',
'warc-type': 'conversion'},
'nb_sentences': 37,
'offset': 0},
'text': 'От хилядите кубинци и другите граждани на страните от СИВ, '
'командировани на строежа на АЕЦ-а, в Белене е останал само...'}
```
#### deduplicated_bh
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 9021,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:IN7PHDOP7MZD6RHN6KIJ7SXTY7VC76SK',
'warc-date': '2021-03-08T22:57:31Z',
'warc-identified-content-language': 'hin,eng',
'warc-record-id': '<urn:uuid:62e18c96-cd2c-461b-93d9-900d95eec89e>',
'warc-refers-to': '<urn:uuid:73ee6388-6f0a-460d-ac2e-bbc1a2b63bb4>',
'warc-target-uri': 'https://bh.wikipedia.org/wiki/%E0%A4%B6%E0%A5%8D%E0%A4%B0%E0%A5%87%E0%A4%A3%E0%A5%80:%E0%A4%B5%E0%A4%BF%E0%A4%95%E0%A4%BF%E0%A4%AA%E0%A5%80%E0%A4%A1%E0%A4%BF%E0%A4%AF%E0%A4%BE_%E0%A4%97%E0%A5%88%E0%A4%B0-%E0%A4%AE%E0%A5%81%E0%A4%95%E0%A5%8D%E0%A4%A4_%E0%A4%AB%E0%A4%BE%E0%A4%87%E0%A4%B2_%E0%A4%B5%E0%A5%88%E0%A4%A7_%E0%A4%AC%E0%A5%88%E0%A4%95%E0%A4%B2%E0%A4%BF%E0%A4%82%E0%A4%95_%E0%A4%95%E0%A5%87_%E0%A4%B8%E0%A4%BE%E0%A4%A5?from=Ea',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'ई एगो छुपावल गइल श्रेणी बाटे। ई पन्ना सभ पर तबले ना लउकी जबले कि '
'प्रयोगकर्ता के सेटिंग, छुपावल गइल श्रेणी देखावे खाति...'}
```
#### deduplicated_bn
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 36198,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:7QRYGJ3YDG7SBTFUVMMALFA6UWNDVLVY',
'warc-date': '2021-03-05T07:10:58Z',
'warc-identified-content-language': 'ben',
'warc-record-id': '<urn:uuid:050c0cdb-562c-49e5-bcb6-7e5350531ea6>',
'warc-refers-to': '<urn:uuid:a3749b59-4285-4e90-ba64-aa9d745c1f46>',
'warc-target-uri': 'https://www.kalerkantho.com/online/business/2020/12/06/982949',
'warc-type': 'conversion'},
'nb_sentences': 8,
'offset': 0},
'text': 'নিজস্ব সংবাদদাতা: গাড়ি নয় যেন মানুষের খাঁচা। নেই কোন ভালো বসার '
'আসন, যা আছে সেগুলো ভাঙ্গাচুরা, ময়লা ও ধুলাবালিতে ভর...'}
```
#### deduplicated_bo
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 5059,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:XHKOQL5IQBLCVBANFVH66ZZXJZHEEMYW',
'warc-date': '2021-03-03T15:06:26Z',
'warc-identified-content-language': 'zho,bod',
'warc-record-id': '<urn:uuid:3a406f8f-58cd-4990-ae6f-f63dff7e06e3>',
'warc-refers-to': '<urn:uuid:806c4a11-f8cd-49e8-bc22-cae5e0cf6ef2>',
'warc-target-uri': 'http://tcansee.com/goods.php?id=392',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': '所有分类 藏学名家名著 国内名家名著 国外名家名著政治 社会 法律 政治 法律 社会 经济文学 艺术 旅游 艺术 文学 旅游宗教 历史 '
'文化 宗教 历史 文化教育 童书 工具书 教辅 童书 工具书语言文字 语言研究 语言 文字期刊 社...'}
```
#### deduplicated_bpy
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 8270,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:POHCGWDC32KW74IE26NTJ2UMNX7QRBDB',
'warc-date': '2021-03-05T14:00:16Z',
'warc-identified-content-language': 'ben',
'warc-record-id': '<urn:uuid:d53007ee-ddbe-44e9-8253-235567d2960c>',
'warc-refers-to': '<urn:uuid:0409ce75-26bc-4a60-b08d-4e2b6174127e>',
'warc-target-uri': 'http://pobnapurup.gaibandha.gov.bd/site/page/5dc0a075-18fd-11e7-9461-286ed488c766/%E0%A6%95%E0%A6%BE%E0%A6%B0%E0%A7%8D%E0%A6%AF%E0%A6%BE%E0%A6%AC%E0%A6%B2%E0%A7%80',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'পবনাপুর ইউনিয়ন---কিশোরগাড়ী ইউনিয়নহোসেনপুর ইউনিয়নপলাশবাড়ী '
'ইউনিয়নবরিশাল ইউনিয়নমহদীপুর ইউনিয়নবেতকাপা ইউনিয়নপবনাপুর ইউনিয়...'}
```
#### deduplicated_br
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 3134,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:U353JBWLMC22GRYEIDN4WOSBUOIUMYQT',
'warc-date': '2021-02-24T21:00:25Z',
'warc-identified-content-language': 'bre',
'warc-record-id': '<urn:uuid:49d1650d-aaf5-43b9-b340-326746e88b31>',
'warc-refers-to': '<urn:uuid:04877e5f-6b86-497e-b39c-30a72683261f>',
'warc-target-uri': 'https://br.m.wiktionary.org/wiki/dont',
'warc-type': 'conversion'},
'nb_sentences': 2,
'offset': 0},
'text': 'Sellet e vez ouzh ar bajenn pe ar gevrenn-mañ evel un divraz da '
'glokaat e brezhoneg. Mar gouezit tra pe dra diwar-ben...'}
```
#### deduplicated_bs
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 8483,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:HS77KGP5HJKJASHMW6WSYV326BPGVM35',
'warc-date': '2021-02-24T18:13:58Z',
'warc-identified-content-language': 'bos,hrv',
'warc-record-id': '<urn:uuid:c12f1b14-4194-405e-a059-9af2f7146940>',
'warc-refers-to': '<urn:uuid:31bedcb4-265f-4aa3-8d2c-cfdc64c42325>',
'warc-target-uri': 'http://mojusk.ba/zastrasujuce-slike-tamnice-u-kojoj-je-skolski-domar-silovao-12-godisnjakinju/',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Predsjednica Evropske centralne banke Christine Lagarde izjavila je '
'da njen najveći strah nije da će Evropska...'}
```
#### deduplicated_bxr
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 6751,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:RELUZWSMYT63FAPLHP55SMNNCSXIQEDX',
'warc-date': '2021-02-26T07:18:33Z',
'warc-identified-content-language': 'mon,rus',
'warc-record-id': '<urn:uuid:efe8d9fa-4329-4479-aa56-43938e8e5370>',
'warc-refers-to': '<urn:uuid:bba3bfb2-b7c7-4605-9f49-34598eac9a5b>',
'warc-target-uri': 'http://soyol.ru/bur/yoho-zanshal/hoityn/',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Хүнэй бэе мүнхэ бэшэ. Һүнэһэнэй бэеымнай орхижо, түрэлөө '
'урилхадань, тэрэнэй хальһан боложо ябаһан бэемнай үхэнэ, газ...'}
```
#### deduplicated_ca
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 30591,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:DJYNCXSBI5JH4V3LKGE7YNQBL34E3W5G',
'warc-date': '2021-03-02T21:39:28Z',
'warc-identified-content-language': 'cat,eng',
'warc-record-id': '<urn:uuid:ec350f95-900b-4164-aab3-8a6451228d5b>',
'warc-refers-to': '<urn:uuid:4c8e31b8-3011-4a21-9591-39be0942e121>',
'warc-target-uri': 'https://ca.m.wikipedia.org/wiki/Regne_d%27Ayutthaya',
'warc-type': 'conversion'},
'nb_sentences': 33,
'offset': 0},
'text': "El regne d'Ayutthaya va ser un estat a Tailàndia que va existir de "
'1351 a 1767 governat per un rei. El rei Rāmadhipat...'}
```
#### deduplicated_cbk
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 151273,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:JCULI5BTSXOFUJYKZPPLMU5BZEZJZEVJ',
'warc-date': '2021-03-04T21:00:26Z',
'warc-identified-content-language': 'ita',
'warc-record-id': '<urn:uuid:ca25bd6b-9a5f-41b5-8b0f-ad437a545cee>',
'warc-refers-to': '<urn:uuid:ac67c26c-c62a-4c3d-9bd9-dd66a78a474f>',
'warc-target-uri': 'https://it.readkong.com/page/note-di-un-anno-di-lavoro-plural-3281543',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': ' '
'na '
'...'}
```
#### deduplicated_ce
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 5944,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:AXGWUWKZ5HO42LSEO32HWLT77MATHGXB',
'warc-date': '2021-03-03T14:41:28Z',
'warc-identified-content-language': 'eng',
'warc-record-id': '<urn:uuid:1333c910-7921-4bdd-9bb9-1a8322dfa74b>',
'warc-refers-to': '<urn:uuid:9e976ac2-74e4-4e30-8c49-12f2dc1c257c>',
'warc-target-uri': 'https://www.radiomarsho.com/a/27368811.html',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Апти Бисултанов вина 1959 шарахь. Апти -- гоьваьлла нохчийн '
'кхузаманахьлера байтанча ву. 1983 шарахь цо чекхъяккхира ...'}
```
#### deduplicated_ceb
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 8799,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:GSVQUFRLD3BYXEG2ASAEVHR2IH4D7A2S',
'warc-date': '2021-03-09T04:28:21Z',
'warc-identified-content-language': 'ceb,eng',
'warc-record-id': '<urn:uuid:e53f5344-29f5-4e59-8dac-8fdc92d1758f>',
'warc-refers-to': '<urn:uuid:03c0e7e5-b84c-4205-80cc-c3fb3dc82406>',
'warc-target-uri': 'https://www.safesworld.com/ceb/safewell-17ef-small-combination-lock-digital-safe-box-with-electronic-combination.html',
'warc-type': 'conversion'},
'nb_sentences': 4,
'offset': 0},
'text': '17EF SERYE Talagsaong design ug madanihon nga kolor naghimo 17EF '
'popular nga sa taliwala sa mga anak ug mga babaye, k...'}
```
#### deduplicated_ckb
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 8668,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:XZOIJPSX5QTL5QQPQMXEVADFHZTXMP5I',
'warc-date': '2021-03-09T03:25:59Z',
'warc-identified-content-language': 'kur,eng',
'warc-record-id': '<urn:uuid:9fe2f7e9-c158-4b84-a4a3-24e51acbd69e>',
'warc-refers-to': '<urn:uuid:14902cc0-948b-4dcf-bde6-e687ba41212f>',
'warc-target-uri': 'https://www.dastihawkary.org/blog/portfolio/social-harms-of-drugs/?lang=en',
'warc-type': 'conversion'},
'nb_sentences': 9,
'offset': 0},
'text': 'وەبیرم دێ\u200c لە كۆتایی هەشتاكانی سەدەی ڕابردوو دیاردەیەك هەبوو '
'لەنێو گەنجە لادەرەكانی شاری هەولێر و سەرشەقام هەڵدەستان ...'}
```
#### deduplicated_cs
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 17263,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:EJZ477E7PWMVVVM777MHB5DMDHVYEWK6',
'warc-date': '2021-03-05T11:28:42Z',
'warc-identified-content-language': 'ces',
'warc-record-id': '<urn:uuid:6fc03e7f-9768-4f26-89ce-84fa4732e3c0>',
'warc-refers-to': '<urn:uuid:d78128e5-f667-4461-9f0c-2263d75b74a1>',
'warc-target-uri': 'https://www.lidovky.cz/relax/dobra-chut/mak-a-svestky-vyzkousejte-makovec-podle-romana-pauluse.A150427_125913_dobra-chut_ape?recommendationId=00000000-0000-5000-8000-000000000000',
'warc-type': 'conversion'},
'nb_sentences': 12,
'offset': 0},
'text': 'Porno motor vyhledávání o nové sedlo masáž se svou. pro měkký sex '
'voda učitelka kočička videa stránky Starý pár sex n...'}
```
#### deduplicated_cv
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 4133,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:FKR5EKWIFACLGBIK6IKLHTHDNTEZNF3T',
'warc-date': '2021-03-03T14:25:27Z',
'warc-identified-content-language': 'rus',
'warc-record-id': '<urn:uuid:8140dbf0-2fb0-48d8-a834-c1b052bcc72d>',
'warc-refers-to': '<urn:uuid:cca433fe-6646-4ab7-b5da-f8e17821b43d>',
'warc-target-uri': 'http://chuv-krarm.3dn.ru/blog/vladimir_leontev_savna_masharam_emer_perle_purnar_i/2013-02-08-47',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Сайт авторĕ тата модераторĕ- Михайлов Алексей, Чăваш Республикин '
'Президенчĕн 2010,2012 çулсенчи стипендиачĕ, Сайт адм...'}
```
#### deduplicated_cy
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 1967,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:RNFNJNY7RHGXN5NPEVF2PYNNIWOTDAMJ',
'warc-date': '2021-03-09T03:48:16Z',
'warc-identified-content-language': 'cym,eng',
'warc-record-id': '<urn:uuid:66f063ba-6a33-4f53-9cfb-7dc64a292e89>',
'warc-refers-to': '<urn:uuid:281f9c10-2d7d-4781-82f6-a504f27852a1>',
'warc-target-uri': 'https://cy.wikipedia.org/wiki/John_T._Koch',
'warc-type': 'conversion'},
'nb_sentences': 2,
'offset': 0},
'text': 'Graddiodd o Brifysgol Harvard, gan gymeryd doethuriaeth mewn '
'Ieithoedd a Llenyddiaethau Celtaidd yn 1985. Bu hefyd yn...'}
```
#### deduplicated_da
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 22154,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:AF2FFBNZQ3TOEEZ3MFDU77CXZ6PVU3ZB',
'warc-date': '2021-03-01T12:49:13Z',
'warc-identified-content-language': 'dan',
'warc-record-id': '<urn:uuid:92fffabd-5d36-4539-b8eb-18a0f2554ddb>',
'warc-refers-to': '<urn:uuid:1970d6bb-474f-448b-a3e1-8a77c9a32cb6>',
'warc-target-uri': 'http://rosamundis.dk/thai-horsens-gode-parfumer-til-m%C3%A6nd/',
'warc-type': 'conversion'},
'nb_sentences': 16,
'offset': 0},
'text': 'Mange praler af den sindsro, de har fundet i huler i det '
'norske/forfaldne franske ferielejligheder etc., hvor de har ...'}
```
#### deduplicated_de
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 11180,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:LLCPCA3RGKMXLYUEA3OZ2KFEEBNEOPE2',
'warc-date': '2021-03-09T01:22:52Z',
'warc-identified-content-language': 'eng,deu',
'warc-record-id': '<urn:uuid:0128ab60-86c8-4dc2-b1cf-57950654ae38>',
'warc-refers-to': '<urn:uuid:ff27032b-b843-4ba3-b1e2-377793173071>',
'warc-target-uri': 'http://bioconcepts.de/views/search.php?term=231&listed=y',
'warc-type': 'conversion'},
'nb_sentences': 16,
'offset': 0},
'text': 'Kreismeisterschaften bringen zahlreiche Sunderner Medaillengewinner '
'und Titelträger - Tischtennis im Sauerland\n'
'Am ver...'}
```
#### deduplicated_diq
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 4196,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:DTA56M722SM5BZLNADOCPXQGGT32J46O',
'warc-date': '2021-03-06T15:51:03Z',
'warc-identified-content-language': 'tur,srp,nno',
'warc-record-id': '<urn:uuid:b7dcd4a4-b130-4009-88d0-631ca51a7bcc>',
'warc-refers-to': '<urn:uuid:fe4e4ad7-3089-40d2-aa29-f675e3cea0dd>',
'warc-target-uri': 'https://diq.wikipedia.org/wiki/Z%C4%B1wan%C3%AA_Slawki',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Zıwanê Slawki, zıwano merdumanê Slawano. Zıwanê Slawki yew lızgeyê '
'Zıwananê Hind u Ewropao. Keyeyê Zıwananê Slawki be...'}
```
#### deduplicated_dsb
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 20663,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:WWZOAFJJLJ4OHG2PTVLCMP664OR26XCR',
'warc-date': '2021-02-27T22:03:14Z',
'warc-identified-content-language': None,
'warc-record-id': '<urn:uuid:239b7155-8f37-4889-bad8-5bdb0aaa83c2>',
'warc-refers-to': '<urn:uuid:2714b744-a080-4807-a29a-d8f99c80e49c>',
'warc-target-uri': 'https://dsb.m.wikipedia.org/wiki/P%C5%9Bed%C5%82oga:LocMap',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Mjaz tamnjejšej pśedłogu a </noinclude>-kodom mógu pśidatne '
'kategorije a cuzorěcne wótkaze stojaś. Ewentualne pśikład...'}
```
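As the `deduplicated_dsb` example above shows, `warc-identified-content-language` can be `None`, so any code that filters records by this header should guard against the missing value before splitting the comma-separated list. A hedged sketch (the function name and the inline sample records are illustrative):

```python
def primary_language(record):
    """Return the first identified content language, or None if absent."""
    langs = record["meta"]["headers"].get("warc-identified-content-language")
    return langs.split(",")[0] if langs else None

# Minimal records mirroring the two cases seen in this card.
records = [
    {"meta": {"headers": {"warc-identified-content-language": "bos,hrv"}}},
    {"meta": {"headers": {"warc-identified-content-language": None}}},
]

print([primary_language(r) for r in records])  # ['bos', None]
```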
#### deduplicated_dv
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 7923,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:ECFUNRNYICXFAZXP5TLM45DPGJX5AHOI',
'warc-date': '2021-02-24T19:53:40Z',
'warc-identified-content-language': 'div,eng',
'warc-record-id': '<urn:uuid:23e2557a-dacc-428c-99fc-e41d4ce2ed95>',
'warc-refers-to': '<urn:uuid:067b6719-0209-49df-8198-27b1954b61b4>',
'warc-target-uri': 'https://dhiislam.com/114288',
'warc-type': 'conversion'},
'nb_sentences': 7,
'offset': 0},
'text': 'މީސްތަކުންގެ ފިކުރާއި ކުޅެލުމަށްޓަކައި މިޒަމާނުގެ ވަސީލަތްތަކުގެ '
'ބޭނުން އެންމެ ރަނގަޅު ގޮތުގައި ހިފަމުންދޭ: ޝެއިޚް ފި...'}
```
#### deduplicated_el
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 12604,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:2LXNVVGR3C4G72RLJUJBKUWLZZJ53TPX',
'warc-date': '2021-03-03T11:34:34Z',
'warc-identified-content-language': 'ell,eng',
'warc-record-id': '<urn:uuid:d95ddbe8-2e54-4d61-a6af-227212090684>',
'warc-refers-to': '<urn:uuid:a0e15450-8455-4b2f-ad8f-3858873a538d>',
'warc-target-uri': 'https://www.androsportal.gr/category/topika/nea-syllogwn/',
'warc-type': 'conversion'},
'nb_sentences': 18,
'offset': 0},
'text': 'Η ραδιοφωνική διαφήμιση χαρακτηρίζεται από αμεσότητα και οικειότητα '
'λόγω της στενής σχέσης του μέσου με τους ακροατές...'}
```
#### deduplicated_eml
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 11710,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:OM2W34UTSIJJHAEXEX42BYMZWBB7U3FS',
'warc-date': '2021-03-05T23:48:29Z',
'warc-identified-content-language': 'ita',
'warc-record-id': '<urn:uuid:26a267af-a6de-4e84-b945-411b78b4815a>',
'warc-refers-to': '<urn:uuid:656aaba2-ff1d-4d7c-915a-9a555533aa42>',
'warc-target-uri': 'https://eml.wikipedia.org/wiki/2_(n%C3%B9mer)',
'warc-type': 'conversion'},
'nb_sentences': 2,
'offset': 0},
'text': "Al 2 'l è al prim nùmer prim ed tùta la séri ch'a s cata in di "
"nùmer naturèl e anc 'l ùnic ch'al sìa pèra:\n"
"Insèm a 'l..."}
```
#### deduplicated_en
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 15201,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:EIQTEGOE4V5SDID2OLTO4PWWCTW3AD5H',
'warc-date': '2021-03-03T18:20:30Z',
'warc-identified-content-language': 'eng',
'warc-record-id': '<urn:uuid:7cec445b-76fe-4ce2-ab43-8a85de680c6f>',
'warc-refers-to': '<urn:uuid:1cf845b2-3015-4f01-abaf-262af4adeba5>',
'warc-target-uri': 'https://www.aqueencitysound.com/2016/05',
'warc-type': 'conversion'},
'nb_sentences': 28,
'offset': 0},
'text': 'But the term “extension” also means lengthening. EkhartYoga members '
'can get to k… Renforcement du dos (muscles para-v...'}
```
#### deduplicated_eo
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 27953,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:YO4NP6746IFQDF5KISEPLNFA2QD3PTEO',
'warc-date': '2021-03-09T05:29:46Z',
'warc-identified-content-language': 'epo,eng',
'warc-record-id': '<urn:uuid:5e3bc7b3-723f-4de9-8202-790351a2253f>',
'warc-refers-to': '<urn:uuid:dd5e537a-f340-4418-bc07-487232ea197c>',
'warc-target-uri': 'http://kantaro.ikso.net/cxu?image=kis_kut.png&ns=&tab_details=view&do=media',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Iloj Montri paĝonMalnovaj reviziojRetroligoj Freŝaj '
'ŝanĝojMedio-administriloIndekso RegistriĝiEnsaluti'}
```
#### deduplicated_es
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 8322,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:DXIQKIWES4PP64BTGK5BYTJ3TX4RVQSI',
'warc-date': '2021-03-03T23:27:45Z',
'warc-identified-content-language': 'spa,eng',
'warc-record-id': '<urn:uuid:4275a14a-f997-4e58-8cf6-046006d76dab>',
'warc-refers-to': '<urn:uuid:d54d1a7b-1316-4bd1-8147-7a44ec5b3803>',
'warc-target-uri': 'https://www.rcrperu.com/defensoria-del-pueblo-oficina-en-lima-sur-registro-mas-de-3000-casos-durante-el-2020/',
'warc-type': 'conversion'},
'nb_sentences': 7,
'offset': 0},
'text': 'Se prevé que a finales de mes haya llegado al 92,5 por ciento de '
'los centros, aquellos en los que no hay confirmados ...'}
```
#### deduplicated_et
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 57234,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:JU7SWP3ZS36M3ABAEPNTFH37MVI2SLAF',
'warc-date': '2021-02-24T20:43:43Z',
'warc-identified-content-language': 'est',
'warc-record-id': '<urn:uuid:2bbcaa39-7336-4ade-accf-1b582785f731>',
'warc-refers-to': '<urn:uuid:849563c9-8549-4bdc-a09c-d179c8399ae0>',
'warc-target-uri': 'https://cardiaccareclinic.com/chto-luchshe-panangin-ili-kardiomagnil.html',
'warc-type': 'conversion'},
'nb_sentences': 129,
'offset': 0},
'text': 'Kas hirmu ei pruugi tekitada hoopis segadus? Näiteks võtame Ukraina '
'kogemuse. Järsku ilmusid välja lindikestega mehed...'}
```
#### deduplicated_eu
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 4248,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:STDEJOH35DPN5UB52OUZJJC4YCN7EH3N',
'warc-date': '2021-03-09T05:11:48Z',
'warc-identified-content-language': 'spa,eus',
'warc-record-id': '<urn:uuid:fb6752f7-5e91-4d0c-b022-71bd5d3ce910>',
'warc-refers-to': '<urn:uuid:faca7a42-20c2-4c4c-bd8a-6d4be5a1adb6>',
'warc-target-uri': 'http://intermedia.eus/la-comunicacion-imprescindible-lo-que-no-debemos-olvidar-de-2015-resumido-en-447/',
'warc-type': 'conversion'},
'nb_sentences': 2,
'offset': 0},
'text': 'Nesken artean bokazio zientifikoak eta teknologikoak sustatzeko '
'INSPIRA STEAM proiektua ia 120 ikastetxetako 5.000 ik...'}
```
#### deduplicated_fa
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 10411,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:VM7Q7TXNMU2SRNHFJSZMBCKU2YVRKI56',
'warc-date': '2021-03-02T11:23:27Z',
'warc-identified-content-language': 'fas',
'warc-record-id': '<urn:uuid:9f666d03-9592-4f59-9111-981a558b3a32>',
'warc-refers-to': '<urn:uuid:8daf3dc1-92dd-4dbf-a339-992c99f09112>',
'warc-target-uri': 'https://zhycan.com/concough/blog/%D9%86%D8%AD%D9%88%D9%87-%D8%AB%D8%A8%D8%AA-%D9%86%D8%A7%D9%85-%DA%A9%D9%86%DA%A9%D9%88%D8%B1-%D8%AF%DA%A9%D8%AA%D8%B1%DB%8C-97-%D8%A7%D8%B9%D9%84%D8%A7%D9%85-%D8%B4%D8%AF-%D8%A7%D9%85/',
'warc-type': 'conversion'},
'nb_sentences': 16,
'offset': 0},
'text': 'انجمن دانشجویان پیام نور تبليغات تماس با ما تبلیغات دسته بندی باز / '
'بسته کردن دسته بندی ها . شرایط اختصاصی برای شغل د...'}
```
#### deduplicated_fi
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 19216,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:5OUEZDSL7KB2VHT2R67YZDER6UO5FHON',
'warc-date': '2021-03-05T00:14:23Z',
'warc-identified-content-language': 'fin,eng',
'warc-record-id': '<urn:uuid:61e0fc42-ceee-4026-ba76-3c8a8addd596>',
'warc-refers-to': '<urn:uuid:c4ba3c9f-5a6c-4de5-8f77-f5beb547315c>',
'warc-target-uri': 'https://kreditassms.eu/arvostelut-treffisivusto-py%C3%B6re%C3%A4-tanssi/',
'warc-type': 'conversion'},
'nb_sentences': 46,
'offset': 0},
'text': 'Facebook ulkomaiset morsiamet fantasia lähellä lohja mistä pillua '
'porno leffat sex treffit karvaiset tussut Thai mass...'}
```
#### deduplicated_fr
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 5274,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:XUVXOZU2BIT4TIDEVHLLBLUIHRS4L7WV',
'warc-date': '2021-03-03T14:00:24Z',
'warc-identified-content-language': 'fra,eng',
'warc-record-id': '<urn:uuid:76252d00-9672-479c-9580-722614e078f9>',
'warc-refers-to': '<urn:uuid:4a6bde1e-9596-4388-9334-cc473a7c93ee>',
'warc-target-uri': 'https://www.cahier-des-charges.net/produit/modele-cahier-des-charges-de-logiciel-de-gestion-de-processus-metier/',
'warc-type': 'conversion'},
'nb_sentences': 9,
'offset': 0},
'text': 'Créée en 1765 par le duc de Villars, alors gouverneur de Provence, '
'l’École supérieure d’art d’Aix en Provence est un ...'}
```
#### deduplicated_frr
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 27381,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:DJE2KO4YWWRERKS5JYSK5JCJWYZ6DJHM',
'warc-date': '2021-03-01T03:40:10Z',
'warc-identified-content-language': 'ell',
'warc-record-id': '<urn:uuid:3a2a34ae-1c42-4d2e-bb08-8dabc916ea30>',
'warc-refers-to': '<urn:uuid:caeb39b2-da76-463d-b80c-4917d3dca230>',
'warc-target-uri': 'https://www.sedik.gr/neo/el/%CE%B1%CF%81%CF%87%CE%B5%CE%AF%CE%BF-%CE%B5%CE%BB%CE%B1%CE%B9%CE%BF%CE%BD%CE%AD%CF%89%CE%BD/%CE%B1%CF%81%CF%87%CE%B5%CE%AF%CE%BF-%CE%B5%CE%BB%CE%B1%CE%B9%CE%BF%CE%BD%CE%AD%CF%89%CE%BD-2009/178-178-title',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': '’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ '
'’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’ ’...'}
```
#### deduplicated_fy
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 1807,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:JABSHFJ2L6SQOXPPTBYGZGR24GCEDTTM',
'warc-date': '2021-03-09T04:24:30Z',
'warc-identified-content-language': 'fry',
'warc-record-id': '<urn:uuid:fd1b28cb-20ce-4082-b1ca-40045ed6af73>',
'warc-refers-to': '<urn:uuid:bc50e1f0-6384-4054-8916-2a489e9a0ffd>',
'warc-target-uri': 'https://www.omropfryslan.nl/nijs/201805-gruttere-lisboksstal-tastien',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Melkfeehâlders yn Súdwest-Fryslân kinne tenei makliker '
"lisboksstâlen fergrutsje no't de gemeente de lanlike wet op st..."}
```
#### deduplicated_ga
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 3296,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:WF6SCFDXN3NOT7FPKTEFOAMMPKXSEZ2W',
'warc-date': '2021-03-09T04:37:11Z',
'warc-identified-content-language': 'gle',
'warc-record-id': '<urn:uuid:bff39289-dbf7-444c-8df1-382fd46c993d>',
'warc-refers-to': '<urn:uuid:e27ba1c5-5707-4e9f-8ba8-f42c67bd9fc9>',
'warc-target-uri': 'http://nos.ie/cultur/iarratais-a-lorg-don-slam-filiochta-agus-duaischiste-700-ann-i-mbliana/',
'warc-type': 'conversion'},
'nb_sentences': 6,
'offset': 0},
'text': 'Tá duaischiste £700 ar fáil do Slam Filíochta Liú Lúnasa a bheidh '
'ar siúl ar líne ag deireadh na míosa seo chugainn. ...'}
```
#### deduplicated_gd
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 7659,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:OO363HOO6EDDYSBTTYB6H4WYAJBBMJ6D',
'warc-date': '2021-03-03T15:22:11Z',
'warc-identified-content-language': 'gla',
'warc-record-id': '<urn:uuid:e24cc86f-ae2c-49f6-b668-cda4f514a34d>',
'warc-refers-to': '<urn:uuid:1739d2d8-974d-4c29-b8d0-3a3ef9082537>',
'warc-target-uri': 'http://gd.cnswmc.com/ty320-3-bulldozer-product/',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Tha inneal-brathaidh TY320-3 crochte leth-chruaidh, gluasad '
'uisgeachaidh, inneal tarbh fo smachd seòrsa hydraulic. Ta...'}
```
#### deduplicated_gl
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 4202,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:TIH7ARF4FNLH7VRGHXKOWVHNXNXC2HZX',
'warc-date': '2021-03-09T04:47:46Z',
'warc-identified-content-language': 'glg',
'warc-record-id': '<urn:uuid:983dd790-0846-4232-a7b4-3956af0982a8>',
'warc-refers-to': '<urn:uuid:b77207af-29d0-459f-9a55-0b25501d3e8b>',
'warc-target-uri': 'http://concellomuxia.com/item/outras-capelas/',
'warc-type': 'conversion'},
'nb_sentences': 8,
'offset': 0},
'text': 'O templo actual é producto de diversas reconstrucións que se '
'realizaron a finais do século XVII e principios do XVIII...'}
```
#### deduplicated_gn
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 3873,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:FWN62CTWNJKPWUARS4BMBUFU6OVHL6XP',
'warc-date': '2021-02-27T22:49:49Z',
'warc-identified-content-language': 'grn,eng,bih',
'warc-record-id': '<urn:uuid:b4954ced-abe0-487e-b5b0-a26beb751a02>',
'warc-refers-to': '<urn:uuid:be5468f1-47f0-4bd8-a177-3529a14dead7>',
'warc-target-uri': 'https://gn.wikipedia.org/wiki/Apere%27arusu',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Ko ñe\'ẽ "apere\'arusu" ou avañe\'ẽ ñe\'ẽngue "apere\'a" he\'ise '
'India Tapiti, ha avañe\'ẽ ñe\'ẽngue "rusu" he\'iséva iguasúva.'}
```
#### deduplicated_gom
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 8747,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:CKNSFAH2KISLLR7222FSQSPENYHQTAX3',
'warc-date': '2021-03-01T11:10:29Z',
'warc-identified-content-language': 'mar',
'warc-record-id': '<urn:uuid:d4622a3e-1b0e-4775-b25d-273ee14ae176>',
'warc-refers-to': '<urn:uuid:9d00e57b-9031-4f86-a9c8-cc3c0c2213a7>',
'warc-target-uri': 'https://gom.m.wikipedia.org/wiki/%E0%A4%B5%E0%A5%80%E0%A4%9C',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'कांय वस्तू रगडल्यो तर तांचेकडेन हलक्यो वस्तू आकर्शित जाता हेंजेन्ना '
'पळयलें तेन्ना वीज हे ऊर्जेची कल्पना मनशाक आयली.हे...'}
```
#### deduplicated_gu
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 15036,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:2FGV42SN72HRKRBEEQ7QJVJBLUYQPCIH',
'warc-date': '2021-03-09T04:48:08Z',
'warc-identified-content-language': 'eng,khm,lao',
'warc-record-id': '<urn:uuid:04d772d6-09db-4d5a-86c8-22b914a35b6f>',
'warc-refers-to': '<urn:uuid:f3cdcafa-5a28-4fbb-81df-7cc5e7bb3248>',
'warc-target-uri': 'http://www.ahealthyme.com/RelatedItems/RelatedDocuments.pg?d=&TypeId=121&ContentId=761&Category=DC',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'ધ્યાન આપો: જો તમે ગુજરા તી બોલતા હો, તો તમને ભા ષા કીય સહાય તા સેવા '
'ઓ વિ ના મૂલ્યે ઉપલબ્ધ છે. તમા રા આઈડી કાર ્ડ પર આ...'}
```
#### deduplicated_gv
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 29707,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:TIDW47D4MAHOLY6PQZ5SHLDYQIJ66REQ',
'warc-date': '2021-03-06T18:16:22Z',
'warc-identified-content-language': 'glv,eng',
'warc-record-id': '<urn:uuid:c7a5e531-487b-4e52-96ca-33b658691652>',
'warc-refers-to': '<urn:uuid:fa7285d4-126c-458f-9a72-d0d8615ce494>',
'warc-target-uri': 'https://gv.wikipedia.org/wiki/%C3%87hengoaylleeaght',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Ta çhengoaylleeaght feamagh eiyrt er sheiltynyssyn çhengoaylleeagh '
'ayns ayrnyn myr ynsaghey çhengaghyn joaree, glare-...'}
```
#### deduplicated_he
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 12254,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:BL56ZUXYO5GLIO6YTBUWKPVYJN2BKCIM',
'warc-date': '2021-03-09T10:29:09Z',
'warc-identified-content-language': 'heb,eng',
'warc-record-id': '<urn:uuid:1ae77825-a836-424e-a8b1-1f9c985a41b9>',
'warc-refers-to': '<urn:uuid:fce3d3dc-979e-4603-82e3-027b75346e52>',
'warc-target-uri': 'https://shop.makeup.land/collections/frontpage',
'warc-type': 'conversion'},
'nb_sentences': 2,
'offset': 0},
'text': 'הולדת פג היא אירוע מטלטל לכל משפחה, אך הולדת פג בצל מגפת הקורונה '
'מאתגרת אף יותר? מהם האתגרים עמם מתמודדים ההורים והצו...'}
```
#### deduplicated_hi
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 7897,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:VZCN5HXN57VQHZJT5G3NWV7RCIT4GP7T',
'warc-date': '2021-02-26T10:18:11Z',
'warc-identified-content-language': 'hin,eng',
'warc-record-id': '<urn:uuid:6cccccb7-be0e-4c16-83be-7b4150b107ac>',
'warc-refers-to': '<urn:uuid:41eda5d1-e2cf-44f4-9f5b-c074a2de89da>',
'warc-target-uri': 'https://36.gurturgoth.com/2019/11/blog-post_8.html',
'warc-type': 'conversion'},
'nb_sentences': 5,
'offset': 0},
'text': 'Bill Gates Biography in Hindi, विश्व के सबसे अमीर इंसान और '
'माइक्रोसॉफ्ट कंपनी के संस्थापक Bill Gates जिसने अपनी बुद्ध...'}
```
#### deduplicated_hr
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 41545,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:6NTZEPK7ETF4AOLM3YDZRLRGZAKH7XM3',
'warc-date': '2021-03-09T04:58:04Z',
'warc-identified-content-language': 'hrv,bos,eng',
'warc-record-id': '<urn:uuid:32361cc9-e12a-4861-978a-b94b84efe78c>',
'warc-refers-to': '<urn:uuid:f0476e5f-e04c-4741-94a6-ddbcfb25c17e>',
'warc-target-uri': 'http://mjesec.ffzg.hr/webpac/?rm=results&show_full=1&f=PersonalName&v=Sanader%20Mirjana',
'warc-type': 'conversion'},
'nb_sentences': 3,
'offset': 0},
'text': 'Impresum: Pula : Sveučilište u Zagrebu, Međunarodno središte '
'hrvatskih sveučilišta u Istri, Međunarodni istraživački ...'}
```
#### deduplicated_hsb
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 3352,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:E5ZCT5OIZBDV2EFBNX3MSLFJKKMZWQWI',
'warc-date': '2021-03-08T22:15:50Z',
'warc-identified-content-language': None,
'warc-record-id': '<urn:uuid:374a31b4-d38f-4d94-b3df-59013b15e644>',
'warc-refers-to': '<urn:uuid:fa9b7b26-2b4c-4acc-a652-47047617b0c0>',
'warc-target-uri': 'https://www.serbske-nowiny.de/index.php/hsb/z-luzicy/lokalka/item/50643-jednotna-proty-ka-tr-bna',
'warc-type': 'conversion'},
'nb_sentences': 2,
'offset': 0},
'text': 'Žonjace akciske tydźenje zahajene\tDźensniši Mjezynarodny dźeń '
'žonow je zazběh hač do 22. apryla trajacych ...\t\n'
'Wotstr...'}
```
#### deduplicated_ht
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 17823,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:LXQEYMTPIKHPAYKEKIZF6FCMC6WH66PW',
'warc-date': '2021-02-25T02:48:22Z',
'warc-identified-content-language': 'rus',
'warc-record-id': '<urn:uuid:a5599306-82ad-4740-9c00-5bba34c96d54>',
'warc-refers-to': '<urn:uuid:2378d2f7-69a4-4f8a-ad03-4d556d031ebb>',
'warc-target-uri': 'http://mywebstores.ru/index.php?id_product=1841&controller=product',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'начать us $ nan us $ nan us $ nan us $ nan us $ nan us $ nan us $ '
'nan us $ nan us $ nan us $ nan us $ nan us $ nan us...'}
```
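Note that `warc-identified-content-language` can disagree with a subset's nominal language: the `deduplicated_ht` example above was identified as `rus`, and some records (e.g. in `deduplicated_hsb`) carry `None`. A hedged sketch (not part of the dataset tooling) of a filter one might apply, using only the fields shown in these examples:

```python
def matches_language(record, expected_iso3):
    """Return True if `expected_iso3` appears among the crawl-time language
    tags of a record; records with no identification (None) do not match."""
    tags = record["meta"]["headers"].get("warc-identified-content-language")
    if tags is None:
        return False
    return expected_iso3 in tags.split(",")

# The deduplicated_ht example above was identified as Russian, so checking
# for Haitian Creole (ISO 639-3 'hat') flags it for review:
ht_record = {"meta": {"headers": {"warc-identified-content-language": "rus"}}}
print(matches_language(ht_record, "hat"))  # False
```

Depending on the use case, records with mismatched or missing identification may be worth dropping or inspecting rather than trusting the subset label alone.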
#### deduplicated_hu
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 39801,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:B3XHZ4C4AJYQLVV3ESGOVZU6FZ5N5637',
'warc-date': '2021-02-26T07:03:18Z',
'warc-identified-content-language': 'hun',
'warc-record-id': '<urn:uuid:926ed467-3adb-44f5-b33c-63112879ba5a>',
'warc-refers-to': '<urn:uuid:9d9175b4-6b0a-45e8-961b-61e9d50eb684>',
'warc-target-uri': 'https://luminanz.eu/anya-hatartalan-ingyen-videok-pina-nagy-video-video-sex-szekx-hd-videa-nyelvu-%C3%B6reg/',
'warc-type': 'conversion'},
'nb_sentences': 104,
'offset': 0},
'text': 'A WordPress egy ingyenesen letölthető rendszer. Letöltés után csak '
'telepíteni kell a webszerverre és máris használhat...'}
```
#### deduplicated_hy
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 6269,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:42PWBXN2Q7PFCRFWIDLTW42KUUGAKQOE',
'warc-date': '2021-02-24T23:49:31Z',
'warc-identified-content-language': 'hye,eng',
'warc-record-id': '<urn:uuid:932d1903-aea7-4be9-abb4-6b3114592c9c>',
'warc-refers-to': '<urn:uuid:cecf676f-884a-4311-a0b5-45ade0f517b7>',
'warc-target-uri': 'https://www.usanogh.am/lur/tramp-amn-coronavirus/',
'warc-type': 'conversion'},
'nb_sentences': 4,
'offset': 0},
'text': 'ՀՀ ԳԱԱ Զեկույցներ =Reports NAS RA կիրառում է «Ստեղծագործական '
'համայնքներ» հեղինակային իրավունքի արտոնագիրը համաձայն որ...'}
```
#### deduplicated_ia
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 9479,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:4JBN4SUDHHRPZI3TAVTZ4JUYSSOGGRFX',
'warc-date': '2021-03-01T17:14:58Z',
'warc-identified-content-language': 'ron,eng',
'warc-record-id': '<urn:uuid:5abe05ff-7309-4c3f-8ccd-175a12a655a2>',
'warc-refers-to': '<urn:uuid:8dec50fd-2be1-4bcf-8bb2-8cb9826c2465>',
'warc-target-uri': 'https://www.monitorulsv.ro/Ultima-ora-local/2008-02-18/Campania-electorala-interzisa-in-Primaria-Suceava',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha '
'ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ha ...'}
```
#### deduplicated_id
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 3080,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:XU6GIUNYT5ELGH5XSZ4FUARC3YTJAD5P',
'warc-date': '2021-03-05T03:32:56Z',
'warc-identified-content-language': 'ind',
'warc-record-id': '<urn:uuid:2328da88-ee5f-4b4c-af3e-25dc4a574041>',
'warc-refers-to': '<urn:uuid:0781f7e2-f020-402b-b204-71fdf299f956>',
'warc-target-uri': 'https://sulsel.kemenag.go.id/berita/berita-kontributor/stqh-26-tingkat-kabupaten-jeneponto-siap-di-gelar',
'warc-type': 'conversion'},
'nb_sentences': 2,
'offset': 0},
'text': '* Masa berlaku normal poin 1 (satu) tahun dan masa berlaku bonus '
'poin sampai dengan 31 Desember 2020.\n'
'Diskon dari Ban...'}
```
#### deduplicated_ie
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 16919,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:W7UDGWMCEYQFEIPJMFZKX72Z6MH4XCUP',
'warc-date': '2021-03-08T16:16:42Z',
'warc-identified-content-language': 'ron,eng',
'warc-record-id': '<urn:uuid:f5ba5473-8eb2-41f4-9e43-3d36f14243a1>',
'warc-refers-to': '<urn:uuid:d2784efa-8250-4370-a348-28c640195663>',
'warc-target-uri': 'https://rolabel.info/door/yX-WpseZpNycfXY/luis-gabriel-haziran-te-am-cautat-si-te-am-gasit-official-video.html',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Va iubesc mult mult mult mult mult mult mult mult mult mult mult '
'mult mult mult mult mult mult mult mult mult mult mu...'}
```
#### deduplicated_ilo
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 3511,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:NLHH2LVPZTUZE37ET2FJIRZNOLPLKK4O',
'warc-date': '2021-03-03T15:52:32Z',
'warc-identified-content-language': 'tgl',
'warc-record-id': '<urn:uuid:2fb6a437-41c8-4c2c-9f5d-2e8c34df9f2b>',
'warc-refers-to': '<urn:uuid:bdc072a0-db63-4256-a96b-7515a2c4fdfd>',
'warc-target-uri': 'https://ilo.m.wikipedia.org/wiki/Amphibia',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Daytoy nga artikulo dagiti nangruna nga artikulo ket pungol. '
'Makatulongka iti Wikipedia babaen ti panagnayon iti daytoy.'}
```
#### deduplicated_io
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 3586,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:VUQPETM2PUWBL5AGADEVN2FPE7KURXG4',
'warc-date': '2021-03-03T15:22:41Z',
'warc-identified-content-language': 'ara',
'warc-record-id': '<urn:uuid:fd8a899b-d54a-424d-9955-a90b81e16439>',
'warc-refers-to': '<urn:uuid:c40226a6-6851-4009-a834-77a1a3e0c0f3>',
'warc-target-uri': 'https://io.wikipedia.org/wiki/New_Vienna,_Iowa',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': "Segun l'Usana Kontado Ministerio, l'urbo havas entote 1.2 km², "
'equivalanta a 0.4 mi², di qui 1.2 km² (0.4 mi²) esas l...'}
```
#### deduplicated_is
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 1829,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:DXUGRT4OK7WRCOPGB7AAKLHPUDTBDRO2',
'warc-date': '2021-03-09T04:40:07Z',
'warc-identified-content-language': 'isl',
'warc-record-id': '<urn:uuid:6568bf31-b402-45b8-9ddb-6ce0f3d0a323>',
'warc-refers-to': '<urn:uuid:5daa12c0-604a-4233-9ed8-d4e245af4048>',
'warc-target-uri': 'http://hugvis.hi.is/',
'warc-type': 'conversion'},
'nb_sentences': 2,
'offset': 0},
'text': 'Vegna hertra aðgerða í bará ttunni við Covid19 munum við takmarka '
'gestafjölda í laugum okkar við 80 manns. Thank you ...'}
```
#### deduplicated_it
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 14112,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:MLJ4TW2HJZAPE2ORVARPJES6GRGO6ZLK',
'warc-date': '2021-03-05T13:56:32Z',
'warc-identified-content-language': 'ita',
'warc-record-id': '<urn:uuid:31d7ebb5-c1f7-468b-92f8-b79b7c28af9f>',
'warc-refers-to': '<urn:uuid:f92f33a2-6940-49fd-a21e-228ee5d2efb1>',
'warc-target-uri': 'https://mauriziomezzetti.com/patologie-trattate/',
'warc-type': 'conversion'},
'nb_sentences': 47,
'offset': 0},
'text': 'Il Presidente del Caffè Letterario Quasimodo di Modica, Domenico '
'Pisana, sarà ospite a Taranto, il prossimo 4 maggio,...'}
```
#### deduplicated_ja
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 16411,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:XOFBBBX7LINQS3EZN5VH6OQ7PPFNRICJ',
'warc-date': '2021-03-09T01:09:27Z',
'warc-identified-content-language': 'jpn,eng,lat',
'warc-record-id': '<urn:uuid:5c0685f4-736d-4155-9153-56cf79462df4>',
'warc-refers-to': '<urn:uuid:88586e1b-926d-4291-910f-53680e3d6482>',
'warc-target-uri': 'http://flpj.karapyzi.ru/30',
'warc-type': 'conversion'},
'nb_sentences': 14,
'offset': 0},
'text': '番組『日本を元気に!スマイルサプライズ!』が、28日に放送(後7:00)。コロナ禍や自然災害など、日本が長いトンネルに入ってしまったような状態だが、「でも、きっとこの先に明るい出口がある!」と明るい未...\n'
'プリゲーム『ポケモンスマイ...'}
```
#### deduplicated_jbo
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 6970,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:2EVVU2OCTSB5EYCHSV6Z7I3PMQSNNOED',
'warc-date': '2021-03-03T23:28:54Z',
'warc-identified-content-language': None,
'warc-record-id': '<urn:uuid:0d4387a2-391d-4e3e-8772-808face0ab78>',
'warc-refers-to': '<urn:uuid:4e45af2a-aea7-4f1a-af89-6ee5f69b7bfd>',
'warc-target-uri': 'https://jbo.m.wikipedia.org/wiki/mumyma%27i_7moi',
'warc-type': 'conversion'},
'nb_sentences': 26,
'offset': 0},
'text': "ni'o 7 la mumast. cu 7moi djedi fi'o masti la mumast. noi ke'a cu "
'mumoi masti .i 6 la mumast. cu purlamdei .ije 8 la ...'}
```
#### deduplicated_jv
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 8822,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:NPQGATEVIAYLOSLDB22EB7IYDVBZ7N6Q',
'warc-date': '2021-03-09T11:14:25Z',
'warc-identified-content-language': 'jav',
'warc-record-id': '<urn:uuid:db7d8bd7-a3a3-4a30-8786-7efb2352285d>',
'warc-refers-to': '<urn:uuid:2cb85a37-545e-471a-b7e7-cb334112f0e3>',
'warc-target-uri': 'https://jv.wikipedia.org/wiki/Bon%C3%A9kah',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Yèn sadurungé golèkan digawé kanggo awaké dhéwé, wiwit jaman iki '
'dikomersialakaké. Fungsiné owah saka ritual lan mode...'}
```
#### deduplicated_ka
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 42480,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:HHSMTLZXKA4SQDPDBWAOUFELXBUJZJKO',
'warc-date': '2021-03-06T15:33:35Z',
'warc-identified-content-language': 'kat,eng',
'warc-record-id': '<urn:uuid:7d931f2a-a6ef-4070-9277-2033e7e96b9b>',
'warc-refers-to': '<urn:uuid:89429497-9722-45e6-95a6-699ef7280e6c>',
'warc-target-uri': 'https://ka.m.wikipedia.org/wiki/%E1%83%93%E1%83%90%E1%83%A1%E1%83%A2%E1%83%98%E1%83%9C_%E1%83%B0%E1%83%9D%E1%83%A4%E1%83%9B%E1%83%90%E1%83%9C%E1%83%98',
'warc-type': 'conversion'},
'nb_sentences': 36,
'offset': 0},
'text': 'დასტინ ჰოფმანი[1] (ინგლ. Dustin Lee Hoffman დ. 8 აგვისტო, 1937) — '
'ორგზის კინოაკადემიის ოსკარისა და ექვსგზის ოქროს გლო...'}
```
#### deduplicated_kk
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 9197,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:BJW4PLV2UOAJLJO6E55YH7DAEWQTFQUZ',
'warc-date': '2021-03-09T04:35:14Z',
'warc-identified-content-language': 'rus,kaz',
'warc-record-id': '<urn:uuid:ddd1d3e1-3bf3-4c4a-b722-8e293ab16f75>',
'warc-refers-to': '<urn:uuid:097c4f10-4bdc-400d-ab39-c04e4f98f51f>',
'warc-target-uri': 'http://blogs.kazakh.ru/blogs/index.php?page=group&gid=6&id=3&PAGEN_1=3%3Fid%3D2?id=6',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Бұрынғы жоғары лауазымды шенеунік Анатолий Шкарупа (сол жақта) '
'өзіне қарсы қозғалған қылмыстық іс бойынша өтіп жатқан...'}
```
#### deduplicated_km
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 15036,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:2FGV42SN72HRKRBEEQ7QJVJBLUYQPCIH',
'warc-date': '2021-03-09T04:48:08Z',
'warc-identified-content-language': 'eng,khm,lao',
'warc-record-id': '<urn:uuid:04d772d6-09db-4d5a-86c8-22b914a35b6f>',
'warc-refers-to': '<urn:uuid:f3cdcafa-5a28-4fbb-81df-7cc5e7bb3248>',
'warc-target-uri': 'http://www.ahealthyme.com/RelatedItems/RelatedDocuments.pg?d=&TypeId=121&ContentId=761&Category=DC',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'ការជូនដំណឹង៖ ប្រសិនប. ើអ្នកនិយាយភាសា ខ្មែរ សេ វាជំនួយភាសាឥតគិតថ្លៃ '
'គឺអាចរកបានសម្ រាប ់អ្នក។ សូមទូរស័ព្ទទ ៅផ ្នែ កសេ វ...'}
```
#### deduplicated_kn
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 8425,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:TMWGSQVJMRPZCPMDM5D3AK2YKGMWBZZI',
'warc-date': '2021-03-09T04:21:39Z',
'warc-identified-content-language': 'kan,eng',
'warc-record-id': '<urn:uuid:ca35da96-ee3a-43ad-8082-a10b055200ca>',
'warc-refers-to': '<urn:uuid:a57cc8f6-c5ed-47a2-9322-2259687cdbde>',
'warc-target-uri': 'https://kannada.b4blaze.com/tag/rachitha-ram/',
'warc-type': 'conversion'},
'nb_sentences': 16,
'offset': 0},
'text': 'ಅಡಿಗರು ಮತ್ತು ರಾಯರು ಚಾಪೆ ಹಾಸಿ ಸ್ವಲ್ಪ ಹೊತ್ತು ಮಲಗಿ ಕಾಫಿ ಕುಡಿದು '
'ಹೊರಟುಹೋದಿದ್ದರು. ಜಾತ್ರೆ ದಿನ ಜಗನ್ನಾಥನ ಮನೆಗೆ ಬರಬಹುದಾದ ನೂರಾರು...'}
```
#### deduplicated_ko
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 2831,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:DLTUACNWU3R5KYI7HMMZF4CYR4WGRMWU',
'warc-date': '2021-02-26T10:13:10Z',
'warc-identified-content-language': 'kor,eng',
'warc-record-id': '<urn:uuid:7f7727bf-bf3d-45c3-8e3c-b595f67f9d90>',
'warc-refers-to': '<urn:uuid:17735508-d2ce-4e0a-a3ba-86acb749b9a2>',
'warc-target-uri': 'http://excel2017.zz.am/entry/mousqul',
'warc-type': 'conversion'},
'nb_sentences': 3,
'offset': 0},
'text': '인류는 최근 수백년 동안 물질적 풍요를 행복의 최대 조건으로 믿고, 이를 추구해 왔다. 그러나 이 과정에서 사람들은 '
'상대방에게 사랑을 베풀기보다는 상처를 입히는 일이 많아졌고, 물질적 풍요는 내면의 충족을 동반...'}
```
#### deduplicated_krc
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 4806,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:CWWWGTU7JCHS7SR5A7D7QMDTF4JBMCA6',
'warc-date': '2021-02-26T04:08:10Z',
'warc-identified-content-language': 'nno,bih',
'warc-record-id': '<urn:uuid:ef2175c0-4887-4006-9b21-374282abf2d2>',
'warc-refers-to': '<urn:uuid:d5aaef09-6f3c-427a-8c2f-664e639c2a0f>',
'warc-target-uri': 'https://krc.wikipedia.org/wiki/1606_%D0%B4%D0%B6%D1%8B%D0%BB',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Бу, тамамланмагъан статьяды. Сиз болушургъа боллукъсуз проектге, '
'тюзетиб эм информация къошуб бу статьягъа.'}
```
#### deduplicated_ku
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 12767,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:BQQEDD5HKU6LXDRIDLMWPIESOMEGIUX6',
'warc-date': '2021-03-09T04:11:10Z',
'warc-identified-content-language': 'eng',
'warc-record-id': '<urn:uuid:5a67e5e4-f688-4aa1-a9a0-2e4f6217ef21>',
'warc-refers-to': '<urn:uuid:40fa61be-18d1-4bd5-9267-252720cd5b05>',
'warc-target-uri': 'http://www.peyamakurd.org/kurmanci/Kurdistan/gruben-smo-ye-bi-hawane-li-til-rifete-xistin-3-miri-u-6-birindar',
'warc-type': 'conversion'},
'nb_sentences': 2,
'offset': 0},
'text': 'PeyamaKurd – Grûbên bi ser Tirkiyê de li Binxetê li bajarokê Til '
'Rifetê bi hawanê lê dan û di encamê de 3 kes mirin û...'}
```
#### deduplicated_kv
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 14161,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:JH3R64H4VMXQ3NRHTX3LO3B4VFN6IZ62',
'warc-date': '2021-03-03T15:09:36Z',
'warc-identified-content-language': 'rus',
'warc-record-id': '<urn:uuid:a94b390c-8e72-475d-bf76-c523c20908ce>',
'warc-refers-to': '<urn:uuid:e11eee46-e68f-4e1b-b4a3-0b9eeb74a877>',
'warc-target-uri': 'https://kv.wikipedia.org/wiki/%D0%9C%D0%B8%D0%BA%D1%83%D1%88%D0%B5%D0%B2_%D0%90%D0%BD%D0%B0%D1%82%D0%BE%D0%BB%D0%B8%D0%B9_%D0%9A%D0%BE%D0%BD%D1%81%D1%82%D0%B0%D0%BD%D1%82%D0%B8%D0%BD%D0%BE%D0%B2%D0%B8%D1%87',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': '1947, моз тӧлысь–1950, кӧч тӧлысь – уджалiс велöдысьöн да '
'директорöн Сыктывдiн районса Ыб шöр школаын.'}
```
#### deduplicated_kw
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 3496,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:S5H4MWHD4QTG74ZNJZ5X63W2XSLUJU7C',
'warc-date': '2021-02-26T18:49:31Z',
'warc-identified-content-language': 'cym',
'warc-record-id': '<urn:uuid:44d32e62-4240-413a-9f8a-562fe27223c6>',
'warc-refers-to': '<urn:uuid:7d95741c-6974-427f-80f7-d08559f799aa>',
'warc-target-uri': 'https://kw.m.wikipedia.org/wiki/Kembra',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Kembra yw konna-tir menydhek yn Howlsedhes Breten Veur. Glow hag '
'owr o poesek yn erbysieth Pow Kembra seulajydh, mes ...'}
```
#### deduplicated_ky
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 28946,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:TVCYX44AC2J2TBVAYMQW62P4XYHWPSAH',
'warc-date': '2021-02-24T20:28:28Z',
'warc-identified-content-language': 'kir,eng',
'warc-record-id': '<urn:uuid:b0b897b8-5d55-4109-967f-9e368be6b7aa>',
'warc-refers-to': '<urn:uuid:b7ac5729-15cb-44c8-a0a2-096cb46cb1de>',
'warc-target-uri': 'http://mezgilnews.kg/tag/klip/',
'warc-type': 'conversion'},
'nb_sentences': 6,
'offset': 0},
'text': 'Мезгил. Ырчы Зерени соцтармактар аркылуу коркуткан белгисиз '
'адамдарды милиция издеп баштады. Чүй облустук ИИБинин маа...'}
```
#### deduplicated_la
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 2647,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:QXPYMWAXXOOHWKBNAYCNUODKWSB56XU4',
'warc-date': '2021-03-09T04:51:12Z',
'warc-identified-content-language': 'lat,eng',
'warc-record-id': '<urn:uuid:684bcdce-19ec-4a44-b814-949eb5ceff66>',
'warc-refers-to': '<urn:uuid:2cd40ddd-0087-41ba-8442-8b2b6b1bbcd2>',
'warc-target-uri': 'http://grhpay.es/index.php/about-us/',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Nam libero tempore, cum soluta nobis est eligendi optio cumque '
'nihil impedit quo minus id quod maxime placeat facere ...'}
```
#### deduplicated_lb
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 2060,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:5YXISU3T3UP7WKUDJ2W45OAKEFJ7ZD2T',
'warc-date': '2021-03-09T04:51:26Z',
'warc-identified-content-language': 'ltz',
'warc-record-id': '<urn:uuid:534e6ce8-782c-4813-9dfb-902736ffc141>',
'warc-refers-to': '<urn:uuid:5829843c-0428-4098-9213-52bb2fb319b2>',
'warc-target-uri': 'https://online-archive-extractor.com/lb/open-7z-file',
'warc-type': 'conversion'},
'nb_sentences': 4,
'offset': 0},
'text': 'Eis Online Archiv Extraiteren erlaabt Iech den Inhalt vu '
'kompriméierten Archiven direkt aus Ärem Browser ze extrahier...'}
```
#### deduplicated_lez
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 6238,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:4MMTYN2QRKUOUZESCUL3AOZJTMDM5YSY',
'warc-date': '2021-03-02T18:06:44Z',
'warc-identified-content-language': 'nno,eng',
'warc-record-id': '<urn:uuid:78581b3a-c21f-46a2-b168-bff6f147c337>',
'warc-refers-to': '<urn:uuid:02f1447d-0b61-4ad5-ac56-0f42c2438e6b>',
'warc-target-uri': 'https://lez.wikipedia.org/wiki/1877_%D0%B9%D0%B8%D1%81',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': '1877 йис (са агъзурни муьжуьдвишни пудкъанницIеирид лагьай йис) — '
'григорийдин чIаваргандал гьалтайла ислендиз эгечӀза...'}
```
#### deduplicated_li
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 2199,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:IIZSY6KLHN5WSCCGU4NZ6K6WYLIMJP4I',
'warc-date': '2021-03-04T07:19:27Z',
'warc-identified-content-language': 'nld',
'warc-record-id': '<urn:uuid:c7eb18bb-ea03-43c2-a1e9-e8eb5b15e25b>',
'warc-refers-to': '<urn:uuid:486a5d06-6dd8-46d2-a93f-d798b8a5bd07>',
'warc-target-uri': 'https://li.m.wikipedia.org/wiki/Waterop',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': "Hoes Karsveld aan de Gulp sjtamp oet de 18e ièw. 't Kesjtièlechtig "
"hoes ies van mergel mèt 'ne trapgevel. 't Ies gebo..."}
```
#### deduplicated_lmo
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 6553,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:DAJPSPBN7BVZNRWANXQAW2KP6LQEWNUW',
'warc-date': '2021-03-04T10:49:45Z',
'warc-identified-content-language': None,
'warc-record-id': '<urn:uuid:d9452b27-9a95-47e9-8274-518138812f56>',
'warc-refers-to': '<urn:uuid:4ff4e796-c685-4c81-adc9-fecbd50e79cb>',
'warc-target-uri': 'https://lmo.wikipedia.org/wiki/Antrenas',
'warc-type': 'conversion'},
'nb_sentences': 2,
'offset': 0},
'text': "El sò teretóre el g'ha 'na superfìce de 17,55 km² e 'l và de 'na "
"altèsa mìnima de 720 méter a 'na altèsa màsima de 11..."}
```
#### deduplicated_lo
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 15036,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:2FGV42SN72HRKRBEEQ7QJVJBLUYQPCIH',
'warc-date': '2021-03-09T04:48:08Z',
'warc-identified-content-language': 'eng,khm,lao',
'warc-record-id': '<urn:uuid:04d772d6-09db-4d5a-86c8-22b914a35b6f>',
'warc-refers-to': '<urn:uuid:f3cdcafa-5a28-4fbb-81df-7cc5e7bb3248>',
'warc-target-uri': 'http://www.ahealthyme.com/RelatedItems/RelatedDocuments.pg?d=&TypeId=121&ContentId=761&Category=DC',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'ຂໍ້ຄວນໃສ່ໃຈ: ຖ້າເຈົ້າເວົ້າພາສາລາວໄດ້, '
'ມີການບໍລິການຊ່ວຍເຫຼືອດ້ານພາສາໃຫ້ທ່ານໂດຍບໍ່ເສຍຄ່າ. ໂທ ຫາ '
'ຝ່າຍບໍລິການສະ ມາ ຊິກທີ່...'}
```
#### deduplicated_lrc
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 7958,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:GTR6WCXERTVUI5RIKHE7MC7LTACF7R2W',
'warc-date': '2021-03-01T04:48:39Z',
'warc-identified-content-language': 'fas,eng',
'warc-record-id': '<urn:uuid:7ba618e0-f09e-48c2-a0be-a1b77ba5678a>',
'warc-refers-to': '<urn:uuid:2e4504e7-46c9-4aaa-818f-3077c73f1d97>',
'warc-target-uri': 'http://www.shaya.me/2013/01/blog-post_3.html',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'یار یار یار یار یار یار یار یار یار یار یار یار یار یار یار یار یار '
'یار یار یار یار یار یار یار یار یار'}
```
#### deduplicated_lt
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 221005,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:KSLULK6RGSIW43IBMSAEU4643LSRMW3V',
'warc-date': '2021-03-05T07:21:10Z',
'warc-identified-content-language': 'lit',
'warc-record-id': '<urn:uuid:fa6592a5-bc87-4683-88d6-37ce74af5058>',
'warc-refers-to': '<urn:uuid:d78122b4-90d8-4cdf-a205-579bcff9ec88>',
'warc-target-uri': 'https://apcis.ktu.edu/lt/site/katalogas?cat_id=132&type=2',
'warc-type': 'conversion'},
'nb_sentences': 219,
'offset': 0},
'text': 'Telšių apskritis – viena iš Lietuvos sričių, kuri turi ką parodyti '
'pasauliui, ir iš to galima pasiekti didelės naudos...'}
```
#### deduplicated_lv
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 4036,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:NUB75CFJHUBI7HOED4HVCNHGQUIVCBO3',
'warc-date': '2021-03-09T03:46:31Z',
'warc-identified-content-language': 'lav,eng',
'warc-record-id': '<urn:uuid:9ad87feb-993f-45b9-bf1e-53a8185b3dc6>',
'warc-refers-to': '<urn:uuid:64eb85d8-c204-4cf8-a6c3-29760fe1f362>',
'warc-target-uri': 'http://igatesbaznica.lv/augupvrsta-stratijas-binr-opcijas.php',
'warc-type': 'conversion'},
'nb_sentences': 10,
'offset': 0},
'text': 'Latvijā šobrīd nav normatīvu aktu mājas un istabas dzīvnieku '
'vairotāju regulēšanai, jo vairākums audzētāju savu nodar...'}
```
#### deduplicated_mai
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 3632,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:OQRKDLTDWJCD37HVHGXYU7E3BXBR5NB3',
'warc-date': '2021-03-01T16:25:27Z',
'warc-identified-content-language': 'bih,hin,fra',
'warc-record-id': '<urn:uuid:da0cf739-4c6c-46d4-9c32-8e34a673fa26>',
'warc-refers-to': '<urn:uuid:0c39ca75-b871-431b-8c89-63d58ea0893f>',
'warc-target-uri': 'https://mai.m.wikipedia.org/wiki/%E0%A4%B0%E0%A4%BE%E0%A4%9C%E0%A4%A7%E0%A4%BE%E0%A4%A8%E0%A5%80',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'शब्द राजधानी संस्कृत सँ आएल अछि । राजधानी आम तौर पर सङ्घटक क्षेत्रक '
'सब सँ पैग सहर होएत अछि मुदा ई जरुरी नै अछि ।[१]'}
```
#### deduplicated_mg
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 2714,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:OGAHJNKN3OSLXYKJKK2LQAFKAEM67DFQ',
'warc-date': '2021-03-03T15:32:59Z',
'warc-identified-content-language': 'mlg,nno',
'warc-record-id': '<urn:uuid:f5a6492f-29c4-4de9-baaa-12edb86d89cd>',
'warc-refers-to': '<urn:uuid:970362fe-4102-481e-8f4b-db5f3e8ce4db>',
'warc-target-uri': 'https://mg.wikipedia.org/wiki/Barro_Alto_(Bahia)',
'warc-type': 'conversion'},
'nb_sentences': 2,
'offset': 0},
'text': "I Barro Alto (Bahia) dia kaominina ao Brazila, ao amin'i Bahia, ao "
"amin'i Centro-Norte Baiano, Irecê.\n"
'Ny velarantanin...'}
```
#### deduplicated_mhr
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 27685,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:YJYVG5XEYRKALEYIO5PCK34QFNUO3JRD',
'warc-date': '2021-03-06T17:12:45Z',
'warc-identified-content-language': 'rus',
'warc-record-id': '<urn:uuid:3405f528-672f-449c-a2a3-cfa73f5d17b0>',
'warc-refers-to': '<urn:uuid:dfe46be9-656c-4b02-9384-fd1e75987a15>',
'warc-target-uri': 'http://marisong.ru/mar/kalendar',
'warc-type': 'conversion'},
'nb_sentences': 31,
'offset': 0},
'text': '1982 — 1985 ийлаште — Палантай лӱмеш музыкальный училищыште баян '
'дене отделенийыште шинчымашым налын.\n'
'Тыгак шуко жап ...'}
```
#### deduplicated_min
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 4309,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:XV23LOBECSVNRXJ2NJTCZVJXOCVQ3BBR',
'warc-date': '2021-03-08T22:10:36Z',
'warc-identified-content-language': 'eng,spa',
'warc-record-id': '<urn:uuid:fdaddf50-1986-44b3-b84b-d9a5d0fa27f1>',
'warc-refers-to': '<urn:uuid:257f7969-3a19-42d6-ae1a-ddb5c0486bb8>',
'warc-target-uri': 'https://cookingwithmydoctor.com/?LOSS=danger-of-keto-diet%2F',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': '\u200f\u200f\u200e \u200e\u200f\u200f\u200e '
'\u200e\u200f\u200f\u200e \u200e\u200f\u200f\u200e '
'\u200e\u200f\u200f\u200e \u200e\u200f\u200f\u200e '
'\u200e\u200f\u200f\u200e \u200e\u200f\u200f\u200e '
'\u200e\u200f\u200f\u200e \u200e\u200f\u200f\u200e '
'\u200e\u200f\u200f\u200e \u200e\u200f\u200f\u200e '
'\u200e\u200f\u200f\u200e \u200e\u200f\u200f\u200e '
'\u200e\u200f\u200f\u200e \u200e\u200f\u200f\u200e '
'\u200e\u200f\u200f\u200e \u200e\u200f\u200f\u200e '
'\u200e\u200f\u200f\u200e \u200e\u200f\u200f\u200e '
'\u200e\u200f\u200f\u200e \u200e\u200f\u200f\u200e '
'\u200e\u200f\u200f\u200e \u200e\u200f\u200f...'}
```
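Some records are almost entirely invisible Unicode formatting characters — the `deduplicated_min` example above is a run of left-to-right and right-to-left marks (`\u200e`, `\u200f`) with spaces between them. A hedged, illustrative filter (not part of the dataset tooling) that scores how much of a string is visible text, which you could use to drop such records before training:

```python
import unicodedata

def printable_ratio(text):
    """Fraction of characters that are not Unicode format (Cf) or control (Cc) characters."""
    if not text:
        return 0.0
    visible = sum(1 for ch in text if unicodedata.category(ch) not in ("Cf", "Cc"))
    return visible / len(text)

# Direction marks are category Cf, so a mark-only string scores 0.0,
# while ordinary text scores close to 1.0.
score = printable_ratio("\u200e\u200f" * 10)
```

A threshold around 0.5 would reject the `deduplicated_min` sample (mostly marks) while keeping every other example shown in this section.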
#### deduplicated_mk
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 22483,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:SGEJ6O6XOEVCQXKXT2XRSRBOSH3ZDSVJ',
'warc-date': '2021-03-02T05:16:16Z',
'warc-identified-content-language': 'mkd,srp,eng',
'warc-record-id': '<urn:uuid:168d1661-a73f-4687-a614-e8cecf7a70a0>',
'warc-refers-to': '<urn:uuid:a61ec44e-a4c1-4b8e-837c-7adc80e853e2>',
'warc-target-uri': 'http://zenica.mk/2018/02/10/tri-dena-kultura-vo-karev-festival/',
'warc-type': 'conversion'},
'nb_sentences': 4,
'offset': 0},
'text': '„Три дена културa“ е настан кој ќе се одржи од 21-23 февруари '
'(среда, четврток и петок, 20:00ч.) во гимназијата „Нико...'}
```
#### deduplicated_ml
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 20202,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:ZOEIO7AIEAGDR2S6TOZYZOAQDOV6QJUE',
'warc-date': '2021-03-08T00:10:05Z',
'warc-identified-content-language': 'mal,eng',
'warc-record-id': '<urn:uuid:f19a2925-0064-47e2-9ec9-48b2786657bd>',
'warc-refers-to': '<urn:uuid:20c7b8fd-1909-480f-b36c-89cd1d0ee3c4>',
'warc-target-uri': 'https://boolokam.com/what-to-do-for-police-clearance-conduct-certificate-in-uae/227247',
'warc-type': 'conversion'},
'nb_sentences': 12,
'offset': 0},
'text': 'രണ്ടുപേര്\u200d തമ്മിലുള്ള സ്നേഹ ബന്ധം അവര്\u200dക്കിടയില്\u200d '
'പൊതുവായി കാണപ്പെടുന്ന മൂല്യങ്ങളുടെ അടിസ്ഥാനത്തില്\u200d '
'ആയിരിക്കും.\n'
'ഒരുവ...'}
```
#### deduplicated_mn
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 5616,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:ILMC56UA63RNTABOJTVMUJQJHMKKC6QR',
'warc-date': '2021-03-09T04:20:37Z',
'warc-identified-content-language': 'mon,ell',
'warc-record-id': '<urn:uuid:07697b69-9e58-4e84-bc0e-a536bcc1ae11>',
'warc-refers-to': '<urn:uuid:704af2f1-3094-45dc-a1c5-63bd08d53069>',
'warc-target-uri': 'http://mn.uncyclopedia.info/index.php?title=%D0%A5%D1%8D%D1%80%D1%8D%D0%B3%D0%BB%D1%8D%D0%B3%D1%87:Mongol_Emperor&action=edit',
'warc-type': 'conversion'},
'nb_sentences': 3,
'offset': 0},
'text': 'Анциклопедиа-д оруулсан бүх хувь нэмэр Creative Commons '
'Attribution-NonCommercial-ShareAlike-н хувьд (дэлгэрэнгүй мэд...'}
```
#### deduplicated_mr
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 11373,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:V3PQES342QGJGRFZ6QMXNB6RIX2ST3V5',
'warc-date': '2021-03-09T05:01:31Z',
'warc-identified-content-language': 'mar,eng',
'warc-record-id': '<urn:uuid:b96cf6ee-7cda-4a7a-9364-08b51284a05e>',
'warc-refers-to': '<urn:uuid:92e533ed-c2c7-4ac7-9b17-af780a503ce6>',
'warc-target-uri': 'https://marathi.thewire.in/devangana-kalita-uapa-bail-rejected-natasha-narwal',
'warc-type': 'conversion'},
'nb_sentences': 9,
'offset': 0},
'text': 'पुण्यातील कार्यक्रमांना स्थगिती:पुण्यातील अनेक सांस्कृतिक नियोजित '
'कार्यक्रमांना स्थगिती, कोरोनाच्या वाढत्या रुग्णांमु...'}
```
#### deduplicated_mrj
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 3492,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:7B242FKI45QVEGJQTF46YCRFYMYW6YFG',
'warc-date': '2021-03-03T05:03:02Z',
'warc-identified-content-language': 'eng',
'warc-record-id': '<urn:uuid:bd7d5682-be60-4a00-9781-29b03a87b30e>',
'warc-refers-to': '<urn:uuid:49641a15-2834-4a72-a011-fdc9cd7273c7>',
'warc-target-uri': 'https://mrj.wikipedia.org/wiki/%D0%91%D0%B0%D1%80%D0%BA%D0%B5%D1%80%D0%B8',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Баркери (латинлӓ Barkeria) – Орхидейвлӓ (Orchidaceae) йыхыш пырышы '
'пеледшӹ кушкыш. Америкышты вӓшлиӓлтеш. Цилӓжӹ 15 й...'}
```
#### deduplicated_ms
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 7939,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:7BWXR4LQ6O2IBJLKLKWJKHTF3JBXB26T',
'warc-date': '2021-03-09T05:38:44Z',
'warc-identified-content-language': 'msa,eng',
'warc-record-id': '<urn:uuid:35a9d91c-3a64-4748-b135-3c467bfa403f>',
'warc-refers-to': '<urn:uuid:9cf4de91-0523-4327-9fcb-5c8f99956da0>',
'warc-target-uri': 'https://kheru2006.livejournal.com/1665383.html',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Bagaimanapun beliau memiliki satu lagi pandangan iaitu perkara '
'paling bodoh seseorang boleh lakukan ialah menjangka d...'}
```
#### deduplicated_mt
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 98714,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:HC75UY5ZHRC3AY4C2VHFR4JADUM2AZBH',
'warc-date': '2021-03-09T04:29:23Z',
'warc-identified-content-language': 'eng,mlt',
'warc-record-id': '<urn:uuid:45dec17d-a638-454e-a136-c45579517b53>',
'warc-refers-to': '<urn:uuid:c82d8d7c-05b6-43d8-be17-5072323aab01>',
'warc-target-uri': 'https://carmelcacopardo.wordpress.com/2015/07/28/',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Kemmuna hi protetta bħala sit Natura 2000. Imma ma nistgħux '
'neskludu logħob tas-soltu biex iduru ma din il-protezzjon...'}
```
#### deduplicated_mwl
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 11598,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:2A22BTIRZ4E5FI2FCG7AUCWJQTY2J4ST',
'warc-date': '2021-02-26T13:58:26Z',
'warc-identified-content-language': None,
'warc-record-id': '<urn:uuid:73a60756-1664-410f-bf62-ab44c88c074f>',
'warc-refers-to': '<urn:uuid:800d3642-449d-4be0-817c-edc7fb64c1b4>',
'warc-target-uri': 'https://mwl.wikipedia.org/wiki/R%C3%A1dio_(quemunica%C3%A7on)',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'La radioquemunicaçon ye un meio de quemunicaçon por trascepçon de '
'anformaçon, podendo ser rializada por Radiaçon eile...'}
```
#### deduplicated_my
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 237288,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:U2QEC6RSZR5UW5LXTNN6QRD47FHVYVJY',
'warc-date': '2021-02-27T06:07:58Z',
'warc-identified-content-language': 'mya,eng',
'warc-record-id': '<urn:uuid:817de4f8-0b7a-446e-bae2-8436019dd34f>',
'warc-refers-to': '<urn:uuid:b364cc33-c1bf-4adb-8317-1aad1cfd4aa0>',
'warc-target-uri': 'http://www.pnsjapan.org/2010/05/',
'warc-type': 'conversion'},
'nb_sentences': 248,
'offset': 0},
'text': 'စတိုင္လည္းက် စမတ္လည္းက်တဲ့ ေန႔စဥ္ လႈပ္ရွားမႈဘဝေလးေတြကို '
'ပိုင္ဆိုင္ႏိုင္ဖို႔အတြက္ Samsung ကေန မၾကာေသးခင္က ထုတ္လုပ္လိုက...'}
```
#### deduplicated_myv
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 11091,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:IFCGUVXSCYHEFYLUVOQ5QMGJWYL2CTVJ',
'warc-date': '2021-03-02T21:05:00Z',
'warc-identified-content-language': 'rus',
'warc-record-id': '<urn:uuid:ea77b8a6-e394-48c1-b865-3cea87e7b906>',
'warc-refers-to': '<urn:uuid:a4927904-4e3c-4f22-858a-adad9bbb1e63>',
'warc-target-uri': 'https://ru.m.wikinews.org/wiki/%D0%9E%D0%BC%D0%B1%D0%BE%D0%BC%D0%B0%D1%81%D1%82%D0%BE%D1%80%D1%81%D0%BE_%C2%AB%D0%90%D0%B7%D0%BE%D1%80%C2%BB_%D1%8D%D1%80%D0%B7%D1%8F%D0%BD%D1%8C_%D1%8D%D1%80%D1%8F%D0%BC%D0%B0%D1%80%D1%82%D0%BE%D0%BD%D1%82%D1%8C_%D0%B2%D0%B0%D1%81%D0%B5%D0%BD%D1%86%D0%B5_%D0%BD%D0%B5%D0%B2%D1%82%D0%B5%D0%BC%D0%B0%D1%81%D1%8C_%D1%8E%D1%82%D1%8B_%D0%A1%D1%83%D0%BE%D0%BC%D0%B8%D1%81%D1%81%D1%8D',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': '«Азор» — васенце эрзянь кельсэ артонь эриванмо-фильманть теемстэ. '
'Орданьбуень Баеньбуе веле, Мордовиясо.'}
```
#### deduplicated_mzn
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 6193,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:QVLHP3APVA34EQ4YFDRJWF2ODTQZ3QG6',
'warc-date': '2021-03-08T00:11:58Z',
'warc-identified-content-language': 'fas',
'warc-record-id': '<urn:uuid:c86dfe2b-795d-4e5d-aaa0-75c1e98690a6>',
'warc-refers-to': '<urn:uuid:b6258701-626d-4a7c-b79e-1c526f9892a6>',
'warc-target-uri': 'https://mzn.wikipedia.org/wiki/%D8%A7%D9%88%D8%B3%D9%88%DA%A9%DB%8C%D8%8C_%D8%A7%D9%88%D8%A6%DB%8C%D8%AA%D8%A7',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'اوسوکی اتا شهر نوم هسته که جاپون ِاوئیتا استان دله دره. ونه جمعیت '
'ره سال ۲۰۰۸ گادِر ۴۲٬۴۶۴ نفر اعلام هاکاردنه. این شه...'}
```
#### deduplicated_nah
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 2517,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:DSXC3C7F2LUL47USAV5ZRT4HMVQ4XGUI',
'warc-date': '2021-03-03T14:32:16Z',
'warc-identified-content-language': 'spa,ell',
'warc-record-id': '<urn:uuid:a305013e-01ba-49a3-89b9-027dc622576f>',
'warc-refers-to': '<urn:uuid:073b9e5a-a0d3-41c3-89bd-8f972b6a4154>',
'warc-target-uri': 'https://nah.wikipedia.org/wiki/%CF%98',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Ϙ ītōcā inic cē huēhuehtlahtōl īpan '
'greciamachiyōtlahtōltecpantiliztli. Ītlahtōl nō ic 90 tlapōhualli.'}
```
#### deduplicated_nap
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 2331,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:EXGUINJCGD2K4E2IVQNJJAQLS4UDJ2TG',
'warc-date': '2021-03-07T13:12:47Z',
'warc-identified-content-language': 'cos,srp,lav',
'warc-record-id': '<urn:uuid:7362689d-31bc-492d-8e60-851c963b5313>',
'warc-refers-to': '<urn:uuid:ecd1bb5f-d247-4739-b9e9-4f93d46081d6>',
'warc-target-uri': 'https://nap.wikipedia.org/wiki/Priatorio',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': "'Int'ô cattolicesimo, priatorio è 'o pruciesso 'e purefecazzione 'e "
"ll'aneme ca moreno 'into ll'amicizzia 'e Dio ma n..."}
```
#### deduplicated_nds
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 5066,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:G2O2EJZLTIU5IDSXMYHPP3TMXVXMAZ3P',
'warc-date': '2021-03-08T22:13:48Z',
'warc-identified-content-language': 'nno,srp',
'warc-record-id': '<urn:uuid:d7f0c9a0-9c12-4d9a-ae5a-184bf7b59c5d>',
'warc-refers-to': '<urn:uuid:31f4d793-f3a4-4403-9c1f-a52f878b63c8>',
'warc-target-uri': 'https://nds.wikipedia.org/wiki/1763',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': '7. Oktober: In London geiht en königliche Proklamatschoon rut, dat '
'vun nu af an in de Kolonien vun Amerika de Kamm vu...'}
```
#### deduplicated_ne
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 17723,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:AZ2CUDZ672TVV2R3O643TJAX7JGXASP2',
'warc-date': '2021-03-08T22:24:08Z',
'warc-identified-content-language': 'nep',
'warc-record-id': '<urn:uuid:fa642413-904a-4def-86fc-a4889e5e9e71>',
'warc-refers-to': '<urn:uuid:f7caed4f-c5ae-4f55-944a-1f06ed71e438>',
'warc-target-uri': 'https://postpati.com/2017/26/07/1353',
'warc-type': 'conversion'},
'nb_sentences': 9,
'offset': 0},
'text': 'युएइको दूतावास बिरुद्द युएइमा रहेका संघ संस्थाहरु द्वारा निरन्तर '
'दवाव आउने क्रमजारि रहेको छ। नेकपा माओबादी सम्बद्ध रह...'}
```
#### deduplicated_new
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 2388,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:E6YZSKQK57PDBRG7VPE64CGOL3N4D63I',
'warc-date': '2021-03-09T04:24:48Z',
'warc-identified-content-language': 'nep,eng,bih',
'warc-record-id': '<urn:uuid:20692995-9d67-4b05-ba9b-9dbac80b4441>',
'warc-refers-to': '<urn:uuid:a8445a70-117a-42c1-89ca-aa5df0cc5616>',
'warc-target-uri': 'https://new.wikipedia.org/wiki/%E0%A4%A7%E0%A4%BE%E0%A4%AA%E0%A4%BE',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'धापा (अंग्रेजी भाय:Dhapa), नेपायागु कर्णाली अञ्चलयागु जुम्ला '
'जिल्लायागु गाँ विकास समिति खः। थ्व थासे231खा छेँ दु।'}
```
#### deduplicated_nl
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 766978,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:77YAXN3F4IGI2CYBM3IESJRTCIB4WY2F',
'warc-date': '2021-02-25T16:49:18Z',
'warc-identified-content-language': 'nld',
'warc-record-id': '<urn:uuid:0b08e51a-1b82-4fb9-a420-8556f2fb47a3>',
'warc-refers-to': '<urn:uuid:dae7ca23-9b7e-45d1-9a1c-604942af8cb9>',
'warc-target-uri': 'https://www.delpher.nl/nl/tijdschriften/view?identifier=MMUBA13:001691001:00689&coll=dts',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': '1 Deze Duitse hond is nauw verwant aan de Duitse Brak, de '
'Westfaalse Dasbrak werd gefokt om op dieren te jagen, zoals...'}
```
#### deduplicated_nn
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 2770,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:FLRYPK225URFXO3IG4LP6D5TI2WW7MNU',
'warc-date': '2021-03-09T03:50:05Z',
'warc-identified-content-language': 'nno',
'warc-record-id': '<urn:uuid:de821d19-abed-4a35-9284-91176a5428b9>',
'warc-refers-to': '<urn:uuid:7ed9913e-e7dd-496f-b0ef-e82098dd53ca>',
'warc-target-uri': 'https://www.avisa-hordaland.no/trafikk/tunell-pa-e16-stengd-2/',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Bilføraren som vart stogga på E16 i helga hadde 2,28 i promille: – '
'Han var ikkje i stand til å ta vare på seg sjølv'}
```
#### deduplicated_no
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 1329,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:G7JC2T5AD4YK4WWFGTYHHGP5VHB6M7KU',
'warc-date': '2021-03-08T13:17:52Z',
'warc-identified-content-language': 'nor',
'warc-record-id': '<urn:uuid:9e215de3-f988-4754-9ef5-6370121b9b5e>',
'warc-refers-to': '<urn:uuid:1facfcb5-da68-4122-9257-102271944050>',
'warc-target-uri': 'https://www.miljoindex.no/781825/nexans-norway-hovedkontor/',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Utvikling, produksjon og markedsføring av kabler og '
'kablingssystemer, samt annen tilknyttet virksomhet, herunder del...'}
```
#### deduplicated_oc
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 20117,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:2XDHRCL2CSS7YFAM2IAGQL6CSJJEQDXI',
'warc-date': '2021-03-03T15:40:21Z',
'warc-identified-content-language': 'oci',
'warc-record-id': '<urn:uuid:c9ebdec5-af68-4756-88c8-1df831621c5b>',
'warc-refers-to': '<urn:uuid:199db451-0e6f-4f75-ad81-2e7612295452>',
'warc-target-uri': 'https://oc.wikipedia.org/wiki/2',
'warc-type': 'conversion'},
'nb_sentences': 18,
'offset': 0},
'text': "8 : dins l'Empèri Part, assassinat dau rèi Orodes III, probablament "
'en causa de son autoritarisme, que foguèt remplaç...'}
```
#### deduplicated_or
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 12859,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:KQDIT6NHKBV43F56DTHTM5ZS3GHJT5SY',
'warc-date': '2021-03-09T05:25:21Z',
'warc-identified-content-language': 'ori,eng',
'warc-record-id': '<urn:uuid:e25e33da-92c5-42d6-aef8-c3465855312a>',
'warc-refers-to': '<urn:uuid:7457ac60-4aae-44ad-aaec-314795ea0708>',
'warc-target-uri': 'https://or.wikipedia.org/wiki/%E0%AC%A6%E0%AD%8D%E0%AD%B1%E0%AC%BF%E0%AC%A4%E0%AD%80%E0%AD%9F_%E0%AC%AC%E0%AC%BF%E0%AC%B6%E0%AD%8D%E0%AD%B1%E0%AC%AF%E0%AD%81%E0%AC%A6%E0%AD%8D%E0%AC%A7',
'warc-type': 'conversion'},
'nb_sentences': 3,
'offset': 0},
'text': 'ଇଉରୋପ, ପ୍ରଶାନ୍ତ ମହାସାଗର, ଆଟଲାଣ୍ଟିକ ମହାସାଗର, ଦକ୍ଷିଣ-ପୂର୍ବ ଏସିଆ, ଚୀନ, '
'ମଧ୍ୟପ୍ରାଚ୍ୟ, ଭୂମଧ୍ୟସାଗର, ଉତ୍ତର ଆଫ୍ରିକା, ପୂର୍ବ ଆଫ୍...'}
```
#### deduplicated_os
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 7079,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:N7CKDF6E3SJBINW4SR6LIUNKLIJP2ROL',
'warc-date': '2021-03-08T22:01:32Z',
'warc-identified-content-language': 'nno',
'warc-record-id': '<urn:uuid:4cd86a68-815b-4539-84a8-bab850034e60>',
'warc-refers-to': '<urn:uuid:8774fb5e-b7fb-4feb-85e7-8c7b33f5980b>',
'warc-target-uri': 'https://os.wikipedia.org/wiki/%D0%9F%D1%83%D1%88%D0%BA%D0%B8%D0%BD,_%D0%A1%D0%B5%D1%80%D0%B3%D0%B5%D0%B9%D1%8B_%D1%84%D1%8B%D1%80%D1%82_%D0%90%D0%BB%D0%B5%D0%BA%D1%81%D0%B0%D0%BD%D0%B4%D1%80',
'warc-type': 'conversion'},
'nb_sentences': 4,
'offset': 0},
'text': 'Пушкин Александр Сергейы фырт (уырыс. Александр Сергеевич Пушкин; '
'райгуырдис 1799 азы 6 июны Мæскуыйы — амардис 1837 ...'}
```
#### deduplicated_pa
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 3990,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:HBYN5XY3CD2KI4XIWMBJYPSV2ZPNBWUN',
'warc-date': '2021-03-09T05:05:20Z',
'warc-identified-content-language': 'pan,eng',
'warc-record-id': '<urn:uuid:1ac5c8d1-e750-492e-b35e-b9780bfd16fd>',
'warc-refers-to': '<urn:uuid:b4d8f997-8c9a-43cf-b16c-e8a77c209062>',
'warc-target-uri': 'https://pa.nhp.gov.in/Detail/getdirection?url=radha-krishna-nurshing-andmat-home-rae_bareli-uttar_pradesh',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'ਇਹ ਪੋਰਟਲ ਰਾਸ਼ਟਰੀ ਸਿਹਤ ਪੋਰਟਲ ਦੇ ਸਿਹਤ ਸੂਚਨਾ ਕੇਂਦਰ (CHI) ਦੁਆਰਾ ਵਿਕਸਿਤ '
'ਤੇ ਤਿਆਰ ਕੀਤਾ ਗਿਆ ਹੈ ਅਤੇ ਸਿਹਤ ਤੇ ਪਰਿਵਾਰ ਭਲਾਈ ਮੰਤਰਾਲੇ...'}
```
#### deduplicated_pam
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 4615,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:WOAFTI75LXN3LAF6WFDRDHITPU33CZRK',
'warc-date': '2021-03-07T22:02:39Z',
'warc-identified-content-language': 'eng',
'warc-record-id': '<urn:uuid:9d7a202a-0fec-4aac-9921-2ebf5aa7f9a2>',
'warc-refers-to': '<urn:uuid:70b6a707-77b1-4a0f-84e6-d75ed8d729ad>',
'warc-target-uri': 'https://toddlers.me/kpai-sarankan-gading-beri-penguatan-psikologi-untuk-gempi/',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': '“Káláu Gádìng tìdák mámpu melákukán ìtu, yá bìsá mìntá tolong '
'kepádá oráng yáng berkompeten, mìsálnyá psìkolog átáu s...'}
```
#### deduplicated_pl
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 51849,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:25YENUTK4YA3ZYGCWQH5Z6YDINCMI6SI',
'warc-date': '2021-03-05T22:43:01Z',
'warc-identified-content-language': 'pol',
'warc-record-id': '<urn:uuid:753116b6-f680-448d-ae8a-8fc88ce061b1>',
'warc-refers-to': '<urn:uuid:926693c4-5b59-4f50-98b9-787576fc71d7>',
'warc-target-uri': 'https://igraszki-jezykowe.pl/category/tips-and-tricks-metodyka/',
'warc-type': 'conversion'},
'nb_sentences': 60,
'offset': 0},
'text': 'W niedzielę, 12 czerwca w Orlando na Florydzie islamski terrorysta, '
'powiązany z ISIS zastrzelił 50 osób i drugie tyle...'}
```
#### deduplicated_pms
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 2620,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:2T5H5XDLC3KPDB33XXVCTGNNYYDJXQWQ',
'warc-date': '2021-03-03T16:04:55Z',
'warc-identified-content-language': 'srp',
'warc-record-id': '<urn:uuid:952c2dda-041e-40ff-bf28-8a39075f53d9>',
'warc-refers-to': '<urn:uuid:6d526022-b736-4a51-9b9c-c5bdd5a546f9>',
'warc-target-uri': 'https://pms.wikipedia.org/wiki/Auer',
'warc-type': 'conversion'},
'nb_sentences': 2,
'offset': 0},
'text': "Auer (Ora për j'italian) a l'é un comun ëd 3.025 abitant dla "
'provincia ëd Bolsan (Region Autònoma Trentin-Sud Tiròl)....'}
```
#### deduplicated_pnb
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 2896,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:GWWDSJAQDB7JDQWV65CI6WT7E6C33DL4',
'warc-date': '2021-03-08T23:01:08Z',
'warc-identified-content-language': 'urd',
'warc-record-id': '<urn:uuid:8c385ca8-7561-4f47-b5a3-0862488eb948>',
'warc-refers-to': '<urn:uuid:837d621d-3540-44fd-a4d0-6cb3c6f2327f>',
'warc-target-uri': 'https://pnb.wikipedia.org/wiki/453%DA%BE',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'لکھت کریئیٹیو کامنز انتساب/ اکوجہے-شراکت لائسنس دے ہیٹھ دستیاب اے، '
'ہور شرطاں وی لاگو ہوسکدیاں نیں۔ ویروے لئی ورتن شرط...'}
```
#### deduplicated_ps
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 2424,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:CAUU5Y7TOTASV7WYKCYRCVXTZ7GGN2VO',
'warc-date': '2021-03-09T05:08:35Z',
'warc-identified-content-language': 'pus',
'warc-record-id': '<urn:uuid:d784cf7a-91e1-4c54-96a2-e41c67318548>',
'warc-refers-to': '<urn:uuid:98aed7d2-c3e3-4039-af83-f2c73a5c19f5>',
'warc-target-uri': 'https://www.mashaalradio.com/a/29821043.html',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'د افغانستان په فاریاب ولایت کې په یوه پارک کې ښځو په برقعو کې ورزش '
'کړی دی. د سیمې چارواکي وايي، د ښځو د ورزش لپاره ځا...'}
```
#### deduplicated_pt
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 79931,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:JYDP4XMEGW2XPPV6NAAF772KDH4X2CCF',
'warc-date': '2021-02-25T13:48:41Z',
'warc-identified-content-language': 'por',
'warc-record-id': '<urn:uuid:3b50f546-e03b-461f-98c8-5a38920d7c0a>',
'warc-refers-to': '<urn:uuid:564bfb21-0705-4997-bbb9-472f0cbcad3e>',
'warc-target-uri': 'http://www.artefazparte.com/',
'warc-type': 'conversion'},
'nb_sentences': 117,
'offset': 0},
'text': 'A reflexão sobre identidade de género anda a cansar muitos de nós. '
'Sobretudo os que não têm dúvidas e nela se sentem ...'}
```
#### deduplicated_qu
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 2630,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:34TX2UNXR2JLRLAFTE3ILOBMEBRMWIRH',
'warc-date': '2021-03-09T05:23:48Z',
'warc-identified-content-language': 'que',
'warc-record-id': '<urn:uuid:237398f6-a300-449b-9e09-7a1ed8cf1e97>',
'warc-refers-to': '<urn:uuid:84b20aab-d538-4efc-bc97-33d546d84802>',
'warc-target-uri': 'https://qu.wikipedia.org/wiki/Sapaq:HukchasqaTinkimuq/Chinchay_Chungcheong_pruwinsya',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': "Kay sapaq p'anqaqa t'inkisqa p'anqakunapi ñaqha hukchasqakunatam "
"rikuchin. Watiqasqayki p'anqakunaqa yanasapa qillqas..."}
```
#### deduplicated_rm
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 100558,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:Z7R6QV2K5FDIHR4QJH7F2NTXND6NDEFY',
'warc-date': '2021-02-27T13:53:32Z',
'warc-identified-content-language': 'deu',
'warc-record-id': '<urn:uuid:da3aec28-6c61-470c-a5d2-66710bc1fb35>',
'warc-refers-to': '<urn:uuid:9d04f371-89a7-4ac2-9b1e-883aa93e4ace>',
'warc-target-uri': 'http://lexbrowser.provinz.bz.it/doc/la/lp-2009-5/lege_provinzialadi_28_de_set_mber_dl_2009_n_5.aspx?view=1',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': '(2) La prestaziun dla garanzia é sotmetüda al’aprovaziun di decunć '
'finanziars da pert dl’aministraziun dl consorz.'}
```
#### deduplicated_ro
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 1677,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:DXKBGKXVETQLCHTHRMLLSWUPXTDNJDVV',
'warc-date': '2021-02-26T12:19:49Z',
'warc-identified-content-language': 'ron',
'warc-record-id': '<urn:uuid:2c20c06f-ca98-4118-9222-7b3b74bc760b>',
'warc-refers-to': '<urn:uuid:e77c028a-5857-4ec2-90db-58a9bb57c510>',
'warc-target-uri': 'https://ro.visafoto.com/es-visa-photo',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Căluşarii sau Boristenii, melodie culeasă din Braşov, în 1832, de '
'Canzler cav. de Ferio şi publicată târziu de Otto H...'}
```
#### deduplicated_ru
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 14025,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:2HSXIFOHEJZOTJV2EVDSZDVF26ATVATE',
'warc-date': '2021-03-07T02:45:16Z',
'warc-identified-content-language': 'rus',
'warc-record-id': '<urn:uuid:aa9b3fc9-fb66-45fa-a064-62ae5fd67970>',
'warc-refers-to': '<urn:uuid:e9145f1e-4ce5-44db-a7d7-234842b31973>',
'warc-target-uri': 'http://budzdorov-kaluga.ru/statyi_i_materialy/o-grippe',
'warc-type': 'conversion'},
'nb_sentences': 15,
'offset': 0},
'text': '«Геро́й» (кит. 英雄) — исторический фильм режиссёра Чжана Имоу, '
'снятый в 2002 году. Продолжительность — 93 минуты (суще...'}
```
#### deduplicated_rue
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 17472,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:YBMO2PR3WF7WQ7UEU5YLRBI7BZ6IP6KB',
'warc-date': '2021-03-06T15:24:27Z',
'warc-identified-content-language': 'ukr,rus',
'warc-record-id': '<urn:uuid:ca71a8fe-adb9-4346-a5b4-7d283f1410f8>',
'warc-refers-to': '<urn:uuid:a609d9f9-5040-4ca5-80a8-aa2c4c7a3525>',
'warc-target-uri': 'https://rue.wikipedia.org/wiki/%D0%9F%D0%BE%D0%BC%D1%96%D1%87:%D0%9A%D0%B0%D1%82%D0%B5%D2%91%D0%BE%D1%80%D1%96%D1%97',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Наприклад можете едітовати Катеґорія:Фізіци і додати одказ '
'[[Катеґорія:Фізіка]]. Катеґорія Фізіци буде пікатеґоріёв к...'}
```
#### deduplicated_sa
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 4166,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:ACZ66HH67HYSPS6I7YYQX64HRD4O5GIH',
'warc-date': '2021-02-24T20:35:30Z',
'warc-identified-content-language': 'san,eng',
'warc-record-id': '<urn:uuid:12bc2393-cb9b-492d-9398-f6b1090bd999>',
'warc-refers-to': '<urn:uuid:6e883bd6-350e-4280-94dc-ee84f44d2458>',
'warc-target-uri': 'https://sa.wikipedia.org/wiki/%E0%A4%B5%E0%A4%BF%E0%A4%B6%E0%A5%87%E0%A4%B7%E0%A4%83:%E0%A4%95%E0%A4%BF%E0%A4%AE%E0%A4%A4%E0%A5%8D%E0%A4%B0_%E0%A4%B8%E0%A4%81%E0%A4%B2%E0%A5%8D%E0%A4%B2%E0%A4%97%E0%A5%8D%E0%A4%A8%E0%A4%AE%E0%A5%8D/%E0%A4%B5%E0%A4%B0%E0%A5%8D%E0%A4%97%E0%A4%83:%E0%A5%A9%E0%A5%AC%E0%A5%A7',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'केभ्यः पृष्ठेभ्यः सम्बद्धम् पृष्ठम्: नामाकाशः : सर्वाणि (मुख्यम्) '
'सम्भाषणम् सदस्यः सदस्यसम्भाषणम् विकिपीडिया विकिपीडि...'}
```
#### deduplicated_sah
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 1724,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:5PKOMLENZCNOU6PT27NCNKTQFPRC37RQ',
'warc-date': '2021-03-03T15:19:03Z',
'warc-identified-content-language': 'ukr,rus',
'warc-record-id': '<urn:uuid:59b7bbeb-e375-4d8c-8b7c-fbe09e5ce21e>',
'warc-refers-to': '<urn:uuid:512d4df0-bd91-47aa-8f23-eb2a8d4b426e>',
'warc-target-uri': 'https://sah.m.wikipedia.org/wiki/%D0%A7%D0%B5%D1%80%D0%BD%D0%B8%D0%B3%D0%BE%D0%B2_%D1%83%D0%BE%D0%B1%D0%B0%D0%BB%D0%B0%D2%BB%D0%B0',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Тиэкис Creative Commons Attribution-ShareAlike лиссиэнсийэ '
'усулуобуйатынан тарҕанар, сорох түбэлтэҕэ эбии көрдөбүллэр...'}
```
#### deduplicated_scn
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 3622,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:VGCXGU3B2WY722G2LRJ56RSYT4HSLUGI',
'warc-date': '2021-03-03T02:35:42Z',
'warc-identified-content-language': 'cos,ita',
'warc-record-id': '<urn:uuid:caeb7ba3-1bc2-4ef7-95cb-eb0d4d0792d6>',
'warc-refers-to': '<urn:uuid:19e33395-5981-4f6d-857b-12cf7d761b58>',
'warc-target-uri': 'https://scn.wikipedia.org/wiki/Canali_d%C3%A2_M%C3%A0nica',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Lu ripartu francisi dâ Mànica, chi cumprenni la pinìsula dû '
'Cotentin, chi si nesci ntô canali, pigghia lu sò nomu dû ...'}
```
#### deduplicated_sco
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 140370,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:TRXAEE4XHP7FT4FCJF3DSEKD7YBPCFOR',
'warc-date': '2021-03-02T07:33:12Z',
'warc-identified-content-language': 'eng,vol',
'warc-record-id': '<urn:uuid:d406a6c9-dba6-4955-8ede-f8082f7da58f>',
'warc-refers-to': '<urn:uuid:155919e0-a689-415c-b2aa-eccd06021476>',
'warc-target-uri': 'https://baggato.com/fo',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'fowjo fowjp fowjq fowjr fowka fowkb fowkc fowkd fowke fowkf fowkg '
'fowkh fowki fowkj fowkk fowkl fowkm fowkn fowko fow...'}
```
#### deduplicated_sd
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 17619,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:DLWVP7WGNP64RB6ZLHDNQEJ7D24BYXOR',
'warc-date': '2021-02-24T20:04:37Z',
'warc-identified-content-language': 'snd,eng',
'warc-record-id': '<urn:uuid:8997e1c6-4d72-47f1-bffe-d18a00ae6b94>',
'warc-refers-to': '<urn:uuid:946e892e-46c3-4a68-8532-1eac8b65b76a>',
'warc-target-uri': 'https://sd.info-4all.ru/%D8%B1%D8%AA%D9%88%D9%BD%D9%88-%D8%A2%D8%A6%D9%8A%D8%B1%D8%B1%D8%A7/%DA%AA%D9%84%D8%A7%DA%AA/',
'warc-type': 'conversion'},
'nb_sentences': 21,
'offset': 0},
'text': 'بيلففيل ڪيئن ٿيو؟ پهرين توهان کي پنهنجو ضمير وڃائڻ جي ضرورت آهي. '
'اهي تعليم کان سواءِ صرف سست ماڻهو نه وٺندا آهن ، پر ...'}
```
#### deduplicated_sh
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 12517,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:IH6O64JAV4PLXURRD5LKU6C46DGGXS27',
'warc-date': '2021-03-09T06:06:53Z',
'warc-identified-content-language': 'fra,hrv,eng',
'warc-record-id': '<urn:uuid:ddc0f982-aea2-4206-a431-02e6c89ab090>',
'warc-refers-to': '<urn:uuid:904a206d-515a-4f11-ad25-9035adbf0cfa>',
'warc-target-uri': 'https://sh.wikipedia.org/wiki/Cliponville',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Po podacima iz 1999. godine u opštini je živelo 245 stanovnika, a '
'gustina naseljenosti je iznosila 33 stanovnika/km²....'}
```
#### deduplicated_si
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 18426,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:CZO426HASJ2VV5IMXEAHY2T53ZTDOZEP',
'warc-date': '2021-02-24T20:38:23Z',
'warc-identified-content-language': 'sin,eng',
'warc-record-id': '<urn:uuid:bec8b1fe-0659-4f47-b244-018b5dac9e30>',
'warc-refers-to': '<urn:uuid:1c918e04-8c2d-4bc0-bcfb-bf978ab0c0ea>',
'warc-target-uri': 'https://androidwedakarayo.com/before-you-look-for-a-job-please-fix-your-facebook-account/',
'warc-type': 'conversion'},
'nb_sentences': 19,
'offset': 0},
'text': 'ඉස්සර තමයි අපි සෝෂල්මීඩියා පාවිච්චි කරන්නේ අපි ආස නළු නිළියන්ගේ '
'ෆොටෝ, හදපු කෑම, ඩ්\u200dරින්ක් එකක් දාන්න සෙට් වෙච්චි වෙලා...'}
```
#### deduplicated_sk
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 37910,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:ODXVMZXR34B45NQTMJIKKK2VGBGRXKEA',
'warc-date': '2021-03-01T16:29:19Z',
'warc-identified-content-language': 'slk',
'warc-record-id': '<urn:uuid:6a22612f-9bbf-4f74-8cca-0457f069baa4>',
'warc-refers-to': '<urn:uuid:3981cb48-fadf-463f-9fc9-a6d717b9dc71>',
'warc-target-uri': 'http://www.tomsta.sk/',
'warc-type': 'conversion'},
'nb_sentences': 56,
'offset': 0},
'text': 'Keďže všade naokolo sú iba kopce, mohol byť jedine horský. Dnes je '
'z toho najlepší horský triatlon na Slovensku, ktor...'}
```
#### deduplicated_sl
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 8130,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:UFZ4P4LVU4TXYJIHZULTCIVJ4GA3JT54',
'warc-date': '2021-03-07T14:50:23Z',
'warc-identified-content-language': 'slv,eng',
'warc-record-id': '<urn:uuid:e50a528d-ebd3-46dc-92d7-af394aaa896a>',
'warc-refers-to': '<urn:uuid:dbfe8ac4-b415-45a8-a16c-c168ed5ce37b>',
'warc-target-uri': 'https://www.edi-nm.com/si/varicosen-mnenja-cena-lekarna/',
'warc-type': 'conversion'},
'nb_sentences': 6,
'offset': 0},
'text': 'Po najnovejših raziskavah v Sloveniji vsaka 4. oseba med 36. in 95. '
'letom trpi zaradi kronične venske insuficience – ...'}
```
#### deduplicated_so
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 17837,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:WIS4GECYGJYMTZMVFOUVUMRWTAPFZUSK',
'warc-date': '2021-03-03T20:11:46Z',
'warc-identified-content-language': 'bul,eng,srp',
'warc-record-id': '<urn:uuid:976de977-97b9-4517-8a42-2fc82fdda461>',
'warc-refers-to': '<urn:uuid:a0f1fbd0-b2cb-495f-93f3-53e77acae3f5>',
'warc-target-uri': 'https://studioqueens.bgnick.info/l4fOorCpgdutsnY/igra-na.html',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'ххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххххх...'}
```
#### deduplicated_sq
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 6129,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:D3PWGEKLJKJEGOTQLYVQNUV4URWEFH2P',
'warc-date': '2021-03-09T03:17:23Z',
'warc-identified-content-language': 'sqi',
'warc-record-id': '<urn:uuid:3299bc56-c7fb-4655-bebd-393510d89aaa>',
'warc-refers-to': '<urn:uuid:1416a2ad-d319-4c60-b663-29239ff79154>',
'warc-target-uri': 'http://ata.gov.al/2019/11/03/video-u-prek-nga-termeti-ndertohet-nga-e-para-banesa-e-familjes-stafa-ne-petrele/',
'warc-type': 'conversion'},
'nb_sentences': 11,
'offset': 0},
'text': 'TIRANË, 3 nëntor/ATSH/- Në Petrelë të Tiranës ka nisur puna për '
'ndërtimin nga e para të shtëpisë së familjes Stafa, e...'}
```
#### deduplicated_sr
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 7735,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:7LKRS7R2L2K53YTV5CYR2IAJRNIQKGBJ',
'warc-date': '2021-03-03T11:23:25Z',
'warc-identified-content-language': 'srp,eng',
'warc-record-id': '<urn:uuid:8ade8406-bedb-41a7-b854-8429b6b21214>',
'warc-refers-to': '<urn:uuid:cca5c75c-7221-4247-a51e-f7be99661793>',
'warc-target-uri': 'https://vojvodjanske.rs/40-jubilarni-somborski-polumaraton-u-nedelju-19-maja/',
'warc-type': 'conversion'},
'nb_sentences': 4,
'offset': 0},
'text': '„У недељу 19. маја, у Сомбору се одржава јубиларна 40. најстарија '
'улична трка у Републици Србији, Сомборски полумарат...'}
```
#### deduplicated_su
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 14013,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:IMFFV646FPXSYLMOATX7O6CDMKUU4BFL',
'warc-date': '2021-03-09T10:29:19Z',
'warc-identified-content-language': 'sun,ind',
'warc-record-id': '<urn:uuid:02eb1f6f-7040-4b8f-b995-7c547196da4b>',
'warc-refers-to': '<urn:uuid:4a9807f7-0c98-493f-ab84-8fafc61a1e50>',
'warc-target-uri': 'https://www.masdinko.com/2019/04/soal-utspts-bahasa-sunda-sd-kelas-4.html',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Pikeun urang lembur, daun seureuh téh geus teu anéh deui. Seureuh '
'mah mangrupa tangkal nu ngarémbét kana tangkal séjéna.'}
```
#### deduplicated_sv
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 87099,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:TKLP6CG56M45ABZQGDD7EDTCQMKTSAVS',
'warc-date': '2021-03-05T20:01:45Z',
'warc-identified-content-language': 'swe',
'warc-record-id': '<urn:uuid:97860695-1688-46ef-93db-5e15742820af>',
'warc-refers-to': '<urn:uuid:7c924b0e-39e1-4921-a561-52dc5453b886>',
'warc-target-uri': 'https://fortretligheter.blogspot.com/2011/01/',
'warc-type': 'conversion'},
'nb_sentences': 255,
'offset': 0},
'text': 'Svenska trupper hade en kväll för flera hundra år sedan när Sverige '
'och Danmark låg i Krig med varandra kommit med sk...'}
```
#### deduplicated_sw
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 2098,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:FPGJP34F47FJQSZF62PELBLYNJ4RTCSE',
'warc-date': '2021-03-03T15:24:39Z',
'warc-identified-content-language': 'swa',
'warc-record-id': '<urn:uuid:d42018de-64be-41f9-b4b6-700dd0051ce3>',
'warc-refers-to': '<urn:uuid:a40c8328-ab33-4113-9ea1-8c35967b0bde>',
'warc-target-uri': 'http://mwanza.go.tz/videos/78',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Mkuu wa Mkoa wa Mwanza Mhe.John Mongella akifungua Baraza la '
'biashara katika kikao kilichofanyika kwenye ukumbi wa mk...'}
```
#### deduplicated_ta
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 49341,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:FQEPDKJ7AYCAEVL5SRUQ5QOULOOSHECD',
'warc-date': '2021-03-09T04:15:52Z',
'warc-identified-content-language': 'tam',
'warc-record-id': '<urn:uuid:2fa70e6a-a31a-4359-b4ff-54ce7f5d6200>',
'warc-refers-to': '<urn:uuid:92eb01ff-4f82-438b-8d1f-1722fe23285a>',
'warc-target-uri': 'https://thiru2050.blogspot.com/2019_05_26_archive.html',
'warc-type': 'conversion'},
'nb_sentences': 15,
'offset': 0},
'text': '... 2017 adimmix psychic leah அறிவுரை கும்பம் மேஷம் ஜோதிடம் '
'புற்றுநோய் மகர படிக குழந்தைகள் மனநோய் புத்தகங்கள் முன்அ...'}
```
#### deduplicated_te
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 31516,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:MG3MFYW5T6XSW3XYZ4ZIKGJW5XAY2RCG',
'warc-date': '2021-03-06T18:07:45Z',
'warc-identified-content-language': 'tel',
'warc-record-id': '<urn:uuid:238b108b-d16e-41d2-b06e-464267352b0e>',
'warc-refers-to': '<urn:uuid:3663318c-d256-4c97-b71b-e4eeb2e6b58a>',
'warc-target-uri': 'https://telugu.greatandhra.com/articles/mbs/ammo-ativa-01-114908.html',
'warc-type': 'conversion'},
'nb_sentences': 15,
'offset': 0},
'text': 'అది 1868. ఇంగ్లండ్\u200cలోని బ్రైటన్\u200cలో క్రిస్టియానా ఎడ్మండ్స్ '
'అనే 40 ఏళ్ల మహిళ వుండేది. పెళ్లి కాలేదు. తల్లితో కలిసి ఒక ఎ...'}
```
#### deduplicated_tg
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 16112,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:LDBVTK3U6MY7J475ZR4LRLFK2CC2QWG5',
'warc-date': '2021-03-09T03:53:03Z',
'warc-identified-content-language': 'tgk,tat,rus',
'warc-record-id': '<urn:uuid:b2519476-6812-4a38-8522-f5292b95e73a>',
'warc-refers-to': '<urn:uuid:f11fa878-d4c6-4e56-bc50-a76554b7d811>',
'warc-target-uri': 'http://hamsafon.tj/2784-imr1263z-1203avoi-1207um1203ur1251-sofu-be1171ubor-meshavad.html',
'warc-type': 'conversion'},
'nb_sentences': 15,
'offset': 0},
'text': 'ДУШАНБЕ, 10.01.2017/АМИТ «Ховар»/. 10 январ дар пойтахти кишвар '
'ҳавои тағйирёбандаи бебориш дар назар дошта шудааст. ...'}
```
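As the `deduplicated_tg` example above shows, `warc-identified-content-language` can hold several comma-separated language codes when more than one language was detected on the source page. A small sketch of splitting the header for filtering, using the value from that example (treating the first code as the primary detection is an assumption, not something the card states):

```python
# Value copied from the deduplicated_tg example above; the header may list
# several language codes when multiple languages were detected on the page.
identified = "tgk,tat,rus"
codes = identified.split(",")

# Assumption: the first code is the primary detected language, so a simple
# filter keeps records whose primary detection matches the config's language.
is_primary_tgk = codes[0] == "tgk"
print(codes, is_primary_tgk)
```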
#### deduplicated_th
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 50841,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:MESEMAONUQXZZEA6IKBT3VCUZ43ZP4B7',
'warc-date': '2021-02-28T15:41:47Z',
'warc-identified-content-language': 'tha,eng',
'warc-record-id': '<urn:uuid:46495e6b-f22f-4dc6-86ab-3bbed66ce7e4>',
'warc-refers-to': '<urn:uuid:10946c1b-9dc5-4afb-bc74-d6baf9793a03>',
'warc-target-uri': 'https://www.thaicsr.com/2009/02/blog-post_08.html',
'warc-type': 'conversion'},
'nb_sentences': 34,
'offset': 0},
'text': 'ปี พ.ศ. 2521 '
'พระบาทสมเด็จพระเจ้าอยู่หัวเสด็จเยี่ยมราษฎรบ้านพระบาทห้วยต้ม '
'ทรงทอดพระเนตรเห็นสภาพพื้นที่และชีวิตความเป็น...'}
```
#### deduplicated_tk
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 22486,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:VNR5UQCQIGPEZQBZL4VAOQDASFOVNRDL',
'warc-date': '2021-03-03T15:07:09Z',
'warc-identified-content-language': 'eng,rus',
'warc-record-id': '<urn:uuid:b514b9c5-1ccd-4cf0-bea7-ea38a5aef686>',
'warc-refers-to': '<urn:uuid:edf1f6cb-9f46-4790-8256-eb984db0f0d5>',
'warc-target-uri': 'http://www.newscentralasia.net/2020/12/02/move-forward-with-universal-right-and-responsibility/',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Türkmenistanyň Daşary işler ministriniň Owganystanyň Milli Yslam '
'Hereketi partiýasynyň ýolbaşçysy bilen duşuşygy'}
```
#### deduplicated_tl
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 15036,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:2FGV42SN72HRKRBEEQ7QJVJBLUYQPCIH',
'warc-date': '2021-03-09T04:48:08Z',
'warc-identified-content-language': 'eng,khm,lao',
'warc-record-id': '<urn:uuid:04d772d6-09db-4d5a-86c8-22b914a35b6f>',
'warc-refers-to': '<urn:uuid:f3cdcafa-5a28-4fbb-81df-7cc5e7bb3248>',
'warc-target-uri': 'http://www.ahealthyme.com/RelatedItems/RelatedDocuments.pg?d=&TypeId=121&ContentId=761&Category=DC',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'PAUNAWA: Kung nagsasalita ka ng wikang Tagalog, mayroon kang '
'magagamit na mga libreng serbisyo para sa tulong sa wika...'}
```
#### deduplicated_tr
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 14815,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:GVNKVEGK7TMZGXIIMLV2O2YWYJRAKBO2',
'warc-date': '2021-03-04T00:44:44Z',
'warc-identified-content-language': 'tur,eng',
'warc-record-id': '<urn:uuid:7acbe6a8-83c4-4ebd-8d29-62cb0b150b2f>',
'warc-refers-to': '<urn:uuid:038ffe28-2fd1-49b9-a5c6-3dddd1af6318>',
'warc-target-uri': 'https://www.kadikoygitarkursum.com/search/label/g%C3%B6ztepe%20gitar%20dersi',
'warc-type': 'conversion'},
'nb_sentences': 5,
'offset': 0},
'text': 'İlk olarak, bir tek siyah kirpik takımı için fiyat belirleyin, '
"örneğin, 4000 ruble'ye eşittir. Artık bir müşteriyle ç..."}
```
#### deduplicated_tt
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 26112,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:FAPA2JNYP6OL53T6OIL3SR3EGMX2R4XY',
'warc-date': '2021-03-09T04:42:07Z',
'warc-identified-content-language': 'tat,rus',
'warc-record-id': '<urn:uuid:5cac6257-fa6c-4e67-9ba1-8e7d7424ef54>',
'warc-refers-to': '<urn:uuid:52642c8d-da35-462f-9776-ccfa88353466>',
'warc-target-uri': 'http://saby-rt.ru/news/konkurslar/fotokonkurs',
'warc-type': 'conversion'},
'nb_sentences': 12,
'offset': 0},
'text': 'Хөрмәтле хатын-кызларбыз! Сезне чын күңелдән 8 Март бәйрәме белән '
'тәбрик итәбез! Яраткан әниләребез, әбиләребез, гоме...'}
```
#### deduplicated_tyv
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 7766,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:L5GRAANBGMGNYXDFF3ECSWJ5Q6D4QFHS',
'warc-date': '2021-02-28T07:20:44Z',
'warc-identified-content-language': 'rus',
'warc-record-id': '<urn:uuid:238082a9-0adf-4c8c-b749-1a523c91e229>',
'warc-refers-to': '<urn:uuid:4bfd0ca2-52bb-4ece-9ccf-cdcee0b30ee9>',
'warc-target-uri': 'https://tyv.wikipedia.org/wiki/%D0%A1%D0%B0%D1%80%D0%BB%D1%8B%D0%BA',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Сарлык бызаазы – ниити ады, назыны бир хар чедир, сарлыктың эр '
'бызаазы аза сарлыктың кыс бызаазы деп чугаалаар.'}
```
#### deduplicated_ug
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 19089,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:DHYFNWWKECLR6BHWF763HC62JRCASMGH',
'warc-date': '2021-03-09T04:33:38Z',
'warc-identified-content-language': 'uig',
'warc-record-id': '<urn:uuid:d1185989-9cd6-40f2-ad63-003e405c9141>',
'warc-refers-to': '<urn:uuid:923ac168-6484-49ea-807d-be3ced85a885>',
'warc-target-uri': 'https://www.akademiye.org/ug/?p=10959',
'warc-type': 'conversion'},
'nb_sentences': 30,
'offset': 0},
'text': 'شەرقىي تۈركىستانئاكادېمىيە ھەققىدەئەزالىقتەۋپىق '
'مۇكاپاتىئىئانەئالاقەTürkçeEnglishئۇيغۇرچەУйғурчәUyghurche\n'
'مىللىي مەۋج...'}
```
#### deduplicated_uk
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 16706,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:46XDNKJUJSG22BA4B6DDET2R5GMBU3LV',
'warc-date': '2021-02-26T22:04:41Z',
'warc-identified-content-language': 'ukr,eng',
'warc-record-id': '<urn:uuid:a3c68b5a-f9e8-41b6-b2bb-3d43e4d7a117>',
'warc-refers-to': '<urn:uuid:6a35e918-42ce-4349-9a6c-edcd22f07254>',
'warc-target-uri': 'https://www.interesniy.kiev.ua/vasil-boroday-korifey-mistetstva-pla/',
'warc-type': 'conversion'},
'nb_sentences': 14,
'offset': 0},
'text': 'На Женевському міжнародному автосалоні 2017 бренд Fiat буде '
'показувати дві свої душі, які співіснують у великій повні...'}
```
#### deduplicated_ur
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 9450,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:3SZ3UYOSHTRE3W3PDZXRO7DDSLRKENV2',
'warc-date': '2021-03-09T03:21:23Z',
'warc-identified-content-language': 'eng,urd,bos',
'warc-record-id': '<urn:uuid:0ded0cb4-2f73-41a7-a093-5dcfed204738>',
'warc-refers-to': '<urn:uuid:6b380ef1-fec4-4f48-bcdc-86700c508dfc>',
'warc-target-uri': 'http://www.khanaghar.org/?p=50',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'اتراکھنڈ کے سلماتا گاؤں کی لڑائیتی دیوی ایک پُر اعتماد اور عقلمند '
'مجاہد ہیں، جن کی طرف دیگر خواتین بھی دیکھ رہی ہیں۔ ...'}
```
#### deduplicated_uz
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 3808,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:FYYLFGJTK74HXE2LRJOAR5E6BPGCQ5NU',
'warc-date': '2021-03-09T04:38:24Z',
'warc-identified-content-language': 'uzb,ben,ltz',
'warc-record-id': '<urn:uuid:2a56bf64-042e-47fa-9abb-819b13bf7920>',
'warc-refers-to': '<urn:uuid:155b1e81-dc6e-46dc-9544-5a6a97c05118>',
'warc-target-uri': 'https://uz.wikipedia.org/wiki/1408',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Matn Creative Commons Attribution-ShareAlike litsenziyasi boʻyicha '
'ommalashtirilmoqda, alohida holatlarda qoʻshimcha ...'}
```
#### deduplicated_vec
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 7088,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:CX2L4ZL4I4OLXG7YJTXLRKNFHE7RIHRX',
'warc-date': '2021-02-24T19:06:44Z',
'warc-identified-content-language': None,
'warc-record-id': '<urn:uuid:abc5a544-7009-407a-a5a3-5c2145195bd5>',
'warc-refers-to': '<urn:uuid:4a956690-536a-437b-afe2-50dc7ac54b39>',
'warc-target-uri': 'https://vec.wikipedia.org/wiki/Utensa:Aelwyn',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Łe parołe che vien dal łatin -TAS, TATIS łe termina par -DÁ. Łe '
'parołe che łe vien da -ICUS łe tèrmina par -ÉGO. Łe p...'}
```
#### deduplicated_vi
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 7845,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:CCXAI5SV5PFLNPSMP4UF4SQGGSYN37AP',
'warc-date': '2021-03-03T02:43:13Z',
'warc-identified-content-language': 'vie',
'warc-record-id': '<urn:uuid:7ce27f30-a1eb-4978-83d0-5110421393b0>',
'warc-refers-to': '<urn:uuid:5dad988d-2426-402c-ac0c-1fa811ed96dc>',
'warc-target-uri': 'http://httlvinhphuoc.org/vi/duong-linh/Hoc-Kinh-Thanh-hang-ngay/Lam-Dieu-Thien-Bang-Tinh-Yeu-Thuong-6521/',
'warc-type': 'conversion'},
'nb_sentences': 8,
'offset': 0},
'text': 'Bitcoin và tiền kỹ thuật số nói chung đang dần xâm nhập vào các '
'thị trường tài chính khi ngày càng có nhiều nhà đ...'}
```
#### deduplicated_vls
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 78684,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:VQNDJYOQXZLCLMDXIFCT4BHSW6LVTJQE',
'warc-date': '2021-02-28T16:16:27Z',
'warc-identified-content-language': 'fra,eng',
'warc-record-id': '<urn:uuid:266acc08-1c69-449f-95ad-0dcc82565788>',
'warc-refers-to': '<urn:uuid:c45dcd64-1b20-4ffc-bdd7-7dbff4f0a726>',
'warc-target-uri': 'https://fr.readkong.com/page/livret-des-licences-faculte-des-sciences-et-des-techniques-7906239',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': ' '
'...'}
```
#### deduplicated_vo
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 1937,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:VPG56ZACAOAZTXHSSXFJOBBH44NWUSJW',
'warc-date': '2021-03-09T06:02:56Z',
'warc-identified-content-language': 'vol,eng,srp',
'warc-record-id': '<urn:uuid:2cb96947-ee22-42a8-be36-31a03203efcc>',
'warc-refers-to': '<urn:uuid:da82b7d8-535b-4e39-8d9b-ea8c3d4a4460>',
'warc-target-uri': 'https://vo.wikipedia.org/wiki/Arnesano',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'Arnesano binon zif in topäd: Puglia, in Litaliyän. Arnesano topon '
'videtü 40° 20’ N e lunetü 18° 6’ L.'}
```
#### deduplicated_wa
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 6518,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:6NC6V46TRVMWTOHCPMTDVRTP7GGL3G3S',
'warc-date': '2021-02-26T09:47:28Z',
'warc-identified-content-language': 'wol',
'warc-record-id': '<urn:uuid:4d800a25-ccf5-4d55-9795-3f7974b988b1>',
'warc-refers-to': '<urn:uuid:87119673-154b-4246-8c39-35737821a7ff>',
'warc-target-uri': 'https://wa.wikipedia.org/wiki/Senegal',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': "Cisse pådje ci n' est co k' on djermon, dj' ô bén k' el pådje est "
"djusse sibåtcheye, eyet co trop tene; et s' divreut..."}
```
#### deduplicated_war
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 7356,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:SVXPIA63QN77O2IJXL4Q75LNVLDEBHYW',
'warc-date': '2021-03-09T05:49:57Z',
'warc-identified-content-language': 'war,tha,eng',
'warc-record-id': '<urn:uuid:a143ebc6-a7b4-4fa7-96b3-59ba2c1dd03c>',
'warc-refers-to': '<urn:uuid:571d090a-cb65-41e7-ae7c-d95588d41c28>',
'warc-target-uri': 'https://war.wikipedia.org/wiki/Chakri_nga_Dinastiya',
'warc-type': 'conversion'},
'nb_sentences': 2,
'offset': 0},
'text': 'An Chakri nga Dinastiya (Thai: ราชวงศ์จักรี: Rajawongse Chakri) '
'namuno ngan naghadi han Thailand tikang han hi hadi T...'}
```
#### deduplicated_wuu
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 26503,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:XAH2SJIYORGGSMLN4DNJZCNVG2FVWF3C',
'warc-date': '2021-03-09T04:09:05Z',
'warc-identified-content-language': 'jpn',
'warc-record-id': '<urn:uuid:8df3f922-fbbf-4733-a3a8-9f34b7505cbf>',
'warc-refers-to': '<urn:uuid:a55eb04e-3679-4817-b94b-e0317142ab2b>',
'warc-target-uri': 'https://wpedia.goo.ne.jp/wiki/%E4%BC%8A%E5%8D%81%E4%BA%94%E5%9E%8B%E6%BD%9C%E6%B0%B4%E8%89%A6',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': '伊15 [I] | 伊17 | 伊19 | 伊21 | 伊23 | 伊25 | 伊26 | 伊27 | 伊28 | 伊29 | 伊30 '
'| 伊31 | 伊32 | 伊33 | 伊34 | 伊35 | 伊36 | 伊37 | 伊38 |...'}
```
#### deduplicated_xal
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 8598,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:KGZNUXNSFUSFYC45UQJRZPEHXNGK6C3H',
'warc-date': '2021-03-02T01:27:37Z',
'warc-identified-content-language': 'rus,spa',
'warc-record-id': '<urn:uuid:676f6ca8-706b-4f77-926f-bda90e3cd772>',
'warc-refers-to': '<urn:uuid:452efc2f-85ce-4e90-b268-2f46893172f8>',
'warc-target-uri': 'http://born.altnzam.com/2014/01/',
'warc-type': 'conversion'},
'nb_sentences': 2,
'offset': 0},
'text': 'Ааһ: Хоосн ааһ би, хагсхларн һанцардсн болҗ медгдҗәнә. Нанд усн йир '
'кергтә болҗана. Ус өгит, — эзнәсн сурна.\n'
'Ааһ ууль...'}
```
#### deduplicated_xmf
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 7053,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:OQKCWDGQCIJHXMM3SCUO2KPBMFCQACUJ',
'warc-date': '2021-03-03T14:27:35Z',
'warc-identified-content-language': 'kat',
'warc-record-id': '<urn:uuid:e701a584-a14f-49ac-80b3-a7604f98fc92>',
'warc-refers-to': '<urn:uuid:8fc0f735-6e2b-45b2-bee1-bf169e08433b>',
'warc-target-uri': 'https://xmf.wikipedia.org/wiki/%E1%83%99%E1%83%90%E1%83%A2%E1%83%94%E1%83%92%E1%83%9D%E1%83%A0%E1%83%98%E1%83%90:%E1%83%90%E1%83%94%E1%83%A0%E1%83%9D%E1%83%9E%E1%83%9D%E1%83%A0%E1%83%A2%E1%83%94%E1%83%A4%E1%83%98_%E1%83%90%E1%83%9C%E1%83%91%E1%83%90%E1%83%9C%E1%83%98%E1%83%A8_%E1%83%9B%E1%83%94%E1%83%AF%E1%83%98%E1%83%9C%E1%83%90%E1%83%97',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'მოჩამილი ტექსტი წჷმორინელი რე Creative Commons '
'Attribution-ShareAlike ლიცენზიათ; შილებე გეძინელი პირობეფიშ '
'არსებუა. კ...'}
```
#### deduplicated_yi
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 10420,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:CZAVPSCGNW77WY2V2IJNK7R2CCUEMZFB',
'warc-date': '2021-02-24T21:10:52Z',
'warc-identified-content-language': 'yid,eng',
'warc-record-id': '<urn:uuid:7aa9e375-726d-42bd-832a-deee6dce5e4a>',
'warc-refers-to': '<urn:uuid:53354991-7bca-4134-95ce-ce7edebf841b>',
'warc-target-uri': 'http://www.kaveshtiebel.com/viewtopic.php?p=237817',
'warc-type': 'conversion'},
'nb_sentences': 10,
'offset': 0},
'text': 'עמעזאן איז יעצט ארויסגעקומען מיט א נייע סמארט ספיקער סיסטעם. '
"ס'הייסט Echo. אין Echo דרייט זיך א ראבאטישקע זי הייסט אל..."}
```
#### deduplicated_yo
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 3627,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:UISXP36HUEMW2LBTMAR4CTISUYAVZZAD',
'warc-date': '2021-03-07T12:45:52Z',
'warc-identified-content-language': 'yor,eng',
'warc-record-id': '<urn:uuid:e67645e9-ee6c-4c88-9b27-a158dc7f83e9>',
'warc-refers-to': '<urn:uuid:07c8d83b-7840-4238-a3b4-edc3f98ecdd5>',
'warc-target-uri': 'https://edeyorubarewa.com/itelorun/',
'warc-type': 'conversion'},
'nb_sentences': 1,
'offset': 0},
'text': 'A dá sílè fún àwọn ènìyàn tí wọn fẹ́ràn láti mò nípa èdè Yorùbá, '
'àṣà àti ìṣe ilẹ̀ kóòtù ojire. Kíkó àwọn ọmọ wa ni Èd...'}
```
#### deduplicated_zh
* Size of downloaded dataset files: None
* Size of the generated dataset: None
* Total amount of disk used: None
An example of 'train' looks as follows:
```
{ 'id': 0,
'meta': { 'headers': { 'content-length': 108400,
'content-type': 'text/plain',
'warc-block-digest': 'sha1:PP6MQUJB3F4G63HKKGKO2QJG7SMRMTFJ',
'warc-date': '2021-02-28T09:41:11Z',
'warc-identified-content-language': 'zho',
'warc-record-id': '<urn:uuid:132aab53-daff-4bae-83d0-a0cdb4039d00>',
'warc-refers-to': '<urn:uuid:2f26c020-f1fc-4216-a616-4683e0b25b1e>',
'warc-target-uri': 'http://www.yummtumm.com/offer',
'warc-type': 'conversion'},
'nb_sentences': 7,
'offset': 0},
'text': '久久精品视频在线看15_久久人人97超碰_久久爱 '
'人人澡超碰碰中文字幕,人人天天夜夜日日狠狠,久久人人97超碰,人人婷婷开心情五月,日日摸天天摸人人看,碰人人么免费视频,色综合天天综合网 '
'久久爱免费视频在线观看_久久爱视频_久久爱在线...'}
```
</details>
### Data Fields
* `id`: a `int64` feature.
* `meta`: Metadata
* `meta.headers`: WARC Headers
* `meta.headers.content-length`: `int64` Content length (in bytes) **before** cleaning
* `meta.headers.content-type`: `string` MIME type
* `meta.headers.warc-block-digest`:`string` Algorithm name and calculated value of a digest applied to the full block of the record
* `meta.headers.warc-date`: `string` Crawl date (YYYY-MM-DDThh:mm:ssZ)
* `meta.headers.warc-identified-content-language`: `string` Comma-separated list of language identifications done by CommonCrawl (uses CLD3)
* `meta.headers.warc-record-id`: `string` Record ID
* `meta.headers.warc-refers-to`: `string` Record-ID of a single record for which the present record holds additional content
* `meta.headers.warc-target-uri`: `string` URI from where the content has been fetched
* `meta.headers.warc-type`: `string` Type of the WARC Record
* `meta.nb_sentences`: `int64` Number of sentences in the text
* `meta.offset`: `int64` Line offset where the related text begins. Should be used together with `meta.nb_sentences` when reading the source files directly rather than using iterators.
* `text`: `string` content
See the [WARC Format standard](https://iipc.github.io/warc-specifications/specifications/warc-format/warc-1.1/#warc-type-mandatory) for more details.
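Taken together, `meta.offset` and `meta.nb_sentences` let you slice a record's text back out of the plain-text source files. A minimal sketch of that lookup, using made-up shard content and metadata rather than real OSCAR files:

```python
# Hypothetical sketch: recovering one record's text from a plain-text shard
# using `meta.offset` (starting line) and `meta.nb_sentences` (line count).
# The shard lines and metadata records below are illustrative stand-ins.
lines = [
    "First sentence of document A.",
    "Second sentence of document A.",
    "Only sentence of document B.",
]

records = [
    {"offset": 0, "nb_sentences": 2},  # document A
    {"offset": 2, "nb_sentences": 1},  # document B
]

def extract_text(lines, record):
    """Slice the shard's lines using the record's offset and sentence count."""
    start = record["offset"]
    end = start + record["nb_sentences"]
    return "\n".join(lines[start:end])

for record in records:
    print(extract_text(lines, record))
```

This is why `offset` resets to 0 for the first record of each source file: it is a position within that file, not a global index.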
### Data Splits
<details>
<summary>Click to expand the sizes and word counts per configuration</summary>
| Language code | Language | Size (original) | Words (original) | Size (deduplicated) | Words (deduplicated) |
|:----|:----------------------------|:-------|:----------------|:---------------|:----------------|
| af | Afrikaans | 258MB | 44,628,392 | 157MB | 27,057,785 |
| als | Alemannic | 7MB | 1,212,699 | 5MB | 871,664 |
| am | Amharic | 405MB | 30,991,914 | 241MB | 18,326,043 |
| an | Aragonese | 1MB | 115,938 | 608KB | 89,043 |
| ar | Arabic | 69GB | 6,494,332,191 | 35GB | 3,365,025,866 |
| arz | Egyptian Arabic | 48MB | 4,998,963 | 21MB | 2,341,904 |
| ast | Asturian | 7MB | 1,085,670 | 4MB | 776,069 |
| as | Assamese | 135MB | 7,917,923 | 95MB | 5,605,207 |
| av | Avaric | 421KB | 25,104 | 325KB | 19,133 |
| azb | South Azerbaijani | 47MB | 3,595,569 | 29MB | 2,243,562 |
| az | Azerbaijani | 3GB | 344,187,319 | 1GB | 169,655,478 |
| bar | Bavarian | 2KB | 247 | 1KB | 245 |
| ba | Bashkir | 110MB | 8,121,603 | 77MB | 5,625,158 |
| be | Belarusian | 2GB | 168,911,341 | 1GB | 98,212,442 |
| bg | Bulgarian | 34GB | 2,994,775,106 | 15GB | 1,315,091,995 |
| bh | Bihari languages | 579KB | 46,436 | 120KB | 9,181 |
| bn | Bangla | 14GB | 814,550,777 | 7GB | 466,289,242 |
| bo | Tibetan | 439MB | 3,751,935 | 358MB | 2,797,085 |
| bpy | Bishnupriya | 11MB | 558,819 | 4MB | 280,825 |
| br | Breton | 49MB | 8,067,480 | 23MB | 4,032,467 |
| bs | Bosnian | 310KB | 50,266 | 175KB | 25,157 |
| bxr | Russia Buriat | 22KB | 1,625 | 18KB | 1,335 |
| ca | Catalan | 13GB | 2,110,833,307 | 6GB | 1,012,770,904 |
| cbk | Chavacano | 168B | 2 | 168B | 2 |
| ceb | Cebuano | 81MB | 12,921,589 | 58MB | 9,201,870 |
| ce | Chechen | 29MB | 2,283,093 | 20MB | 1,638,963 |
| ckb | Central Kurdish | 784MB | 63,417,572 | 367MB | 29,355,017 |
| cs | Czech | 72GB | 9,996,052,434 | 33GB | 4,739,928,730 |
| cv | Chuvash | 60MB | 4,592,449 | 41MB | 3,141,872 |
| cy | Welsh | 307MB | 50,606,998 | 180MB | 30,198,860 |
| da | Danish | 18GB | 2,892,004,180 | 10GB | 1,704,605,898 |
| de | German | 433GB | 58,716,727,164 | 184GB | 25,446,071,671 |
| diq | Dimli (individual language) | 294B | 38 | 147B | 19 |
| dsb | Lower Sorbian | 31KB | 4,115 | 14KB | 1,873 |
| dv | Divehi | 143MB | 8,293,093 | 111MB | 6,481,260 |
| el | Greek | 72GB | 6,024,414,850 | 30GB | 2,539,719,195 |
| eml | Unknown language [eml] | 22KB | 4,360 | 20KB | 3,876 |
| en | English | 2936GB | 488,723,815,522 | 1342GB | 223,669,114,922 |
| eo | Esperanto | 560MB | 84,432,772 | 390MB | 59,411,208 |
| es | Spanish | 342GB | 54,715,337,438 | 160GB | 25,877,724,186 |
| et | Estonian | 7GB | 954,732,803 | 3GB | 455,553,053 |
| eu | Basque | 900MB | 110,676,692 | 503MB | 62,812,888 |
| fa | Persian | 79GB | 8,566,653,720 | 35GB | 3,902,206,854 |
| fi | Finnish | 35GB | 4,074,911,658 | 20GB | 2,357,264,196 |
| frr | Northern Frisian | 7KB | 1,702 | 5KB | 1,267 |
| fr | French | 340GB | 52,839,365,242 | 161GB | 25,245,127,073 |
| fy | Western Frisian | 82MB | 13,094,538 | 57MB | 9,329,828 |
| ga | Irish | 131MB | 20,142,627 | 69MB | 10,835,410 |
| gd | Scottish Gaelic | 2MB | 332,946 | 1MB | 173,588 |
| gl | Galician | 989MB | 155,030,216 | 549MB | 87,015,417 |
| gn | Guarani | 32KB | 3,828 | 25KB | 3,056 |
| gom | Goan Konkani | 3MB | 177,357 | 2MB | 148,801 |
| gu | Gujarati | 1GB | 124,652,589 | 950MB | 63,150,641 |
| gv | Manx | 1KB | 264 | 907B | 141 |
| he | Hebrew | 29GB | 2,829,132,925 | 11GB | 1,156,588,919 |
| hi | Hindi | 26GB | 2,009,754,819 | 13GB | 1,038,914,735 |
| hr | Croatian | 361MB | 51,654,735 | 169MB | 24,583,270 |
| hsb | Upper Sorbian | 2MB | 305,176 | 1MB | 207,715 |
| ht | Haitian Creole | 2KB | 592 | 1KB | 351 |
| hu | Hungarian | 60GB | 7,415,936,687 | 29GB | 3,765,883,306 |
| hy | Armenian | 4GB | 322,429,587 | 1GB | 124,515,953 |
| ia | Interlingua | 291KB | 74,696 | 172KB | 41,625 |
| id | Indonesian | 40GB | 5,767,715,387 | 22GB | 3,126,926,138 |
| ie | Interlingue | 7KB | 1,432 | 2KB | 424 |
| ilo | Iloko | 1MB | 275,029 | 857KB | 140,579 |
| io | Ido | 276KB | 46,463 | 221KB | 36,976 |
| is | Icelandic | 2GB | 290,997,158 | 1GB | 176,018,529 |
| it | Italian | 192GB | 29,252,541,808 | 94GB | 14,426,829,908 |
| ja | Japanese | 208GB | 5,357,000,179 | 96GB | 1,319,938,248 |
| jbo | Lojban | 929KB | 179,684 | 731KB | 140,749 |
| jv | Javanese | 858KB | 121,271 | 728KB | 101,386 |
| ka | Georgian | 6GB | 304,329,117 | 2GB | 116,422,468 |
| kk | Kazakh | 3GB | 236,767,203 | 1GB | 126,886,720 |
| km | Khmer | 1GB | 28,188,612 | 860MB | 13,408,408 |
| kn | Kannada | 2GB | 111,460,546 | 1GB | 56,801,321 |
| ko | Korean | 35GB | 3,367,279,749 | 15GB | 1,475,474,588 |
| krc | Karachay-Balkar | 2MB | 193,207 | 2MB | 153,755 |
| ku | Kurdish | 152MB | 23,845,402 | 108MB | 17,264,310 |
| kv | Komi | 1MB | 89,105 | 588KB | 46,219 |
| kw | Cornish | 119KB | 20,775 | 72KB | 12,687 |
| ky | Kyrgyz | 485MB | 33,401,287 | 334MB | 23,102,129 |
| la | Latin | 103MB | 15,869,314 | 9MB | 1,488,545 |
| lb | Luxembourgish | 54MB | 7,953,887 | 37MB | 5,454,220 |
| lez | Lezghian | 2MB | 214,890 | 2MB | 198,433 |
| li | Limburgish | 76KB | 12,105 | 54KB | 8,472 |
| lmo | Lombard | 1MB | 203,002 | 1MB | 182,533 |
| lo | Lao | 287MB | 6,928,229 | 163MB | 3,620,360 |
| lrc | Northern Luri | 183B | 26 | 183B | 26 |
| lt | Lithuanian | 12GB | 1,573,926,673 | 5GB | 701,326,575 |
| lv | Latvian | 6GB | 799,923,431 | 2GB | 352,753,044 |
| mai | Maithili | 685KB | 144,859 | 24KB | 1,916 |
| mg | Malagasy | 59MB | 8,103,631 | 38MB | 5,220,655 |
| mhr | Eastern Mari | 15MB | 1,170,650 | 10MB | 784,071 |
| min | Minangkabau | 8MB | 451,591 | 1MB | 74,882 |
| mk | Macedonian | 3GB | 261,571,966 | 1GB | 134,544,934 |
| ml | Malayalam | 4GB | 182,898,691 | 2GB | 87,615,430 |
| mn | Mongolian | 1GB | 143,244,180 | 912MB | 71,138,550 |
| mrj | Western Mari | 645KB | 51,812 | 521KB | 41,950 |
| mr | Marathi | 3GB | 173,001,078 | 1GB | 99,858,901 |
| ms | Malay | 146MB | 20,433,250 | 60MB | 8,301,250 |
| mt | Maltese | 51MB | 6,162,888 | 26MB | 3,179,815 |
| mwl | Mirandese | 3KB | 419 | 2KB | 302 |
| my | Burmese | 2GB | 54,624,239 | 1GB | 35,969,724 |
| myv | Erzya | 29KB | 2,844 | 2KB | 236 |
| mzn | Mazanderani | 1MB | 134,128 | 1MB | 106,533 |
| nah | Nahuatl languages | 34KB | 3,664 | 21KB | 2,363 |
| nap | Neapolitan | 1KB | 550 | 1KB | 235 |
| nds | Low German | 25MB | 3,998,912 | 17MB | 2,868,608 |
| ne | Nepali | 3GB | 207,891,824 | 2GB | 142,087,100 |
| new | Newari | 6MB | 433,880 | 4MB | 254,711 |
| nl | Dutch | 97GB | 15,248,924,083 | 47GB | 7,584,055,321 |
| nn | Norwegian Nynorsk | 123MB | 20,629,675 | 66MB | 11,095,804 |
| no | Norwegian Bokmål | 9GB | 1,492,984,384 | 4GB | 776,354,517 |
| oc | Occitan | 12MB | 1,822,595 | 5MB | 834,187 |
| or | Odia | 538MB | 30,838,706 | 357MB | 20,357,839 |
| os | Ossetic | 11MB | 911,794 | 6MB | 536,525 |
| pam | Pampanga | 3KB | 405 | 3KB | 405 |
| pa | Punjabi | 769MB | 59,031,334 | 430MB | 33,413,527 |
| pl | Polish | 122GB | 16,120,806,481 | 48GB | 6,496,098,108 |
| pms | Piedmontese | 4MB | 804,600 | 3MB | 644,017 |
| pnb | Western Panjabi | 68MB | 7,757,785 | 45MB | 5,221,168 |
| ps | Pashto | 404MB | 49,643,597 | 286MB | 35,345,424 |
| pt | Portuguese | 159GB | 24,770,395,312 | 71GB | 11,190,148,216 |
| qu | Quechua | 322KB | 40,691 | 230KB | 29,108 |
| rm | Romansh | 3KB | 512 | 3KB | 429 |
| ro | Romanian | 37GB | 5,629,438,576 | 15GB | 2,387,230,734 |
| rue | Rusyn | 247B | 14 | 247B | 14 |
| ru | Russian | 1201GB | 89,568,364,811 | 542GB | 41,194,052,384 |
| sah | Sakha | 57MB | 2,600,989 | 39MB | 1,944,651 |
| sa | Sanskrit | 72MB | 3,288,786 | 43MB | 1,998,089 |
| scn | Sicilian | 4KB | 712 | 3KB | 516 |
| sco | Scots | 1KB | 523 | 1KB | 282 |
| sd | Sindhi | 75MB | 8,937,427 | 50MB | 6,064,102 |
| sh | Serbian (Latin) | 13MB | 2,164,175 | 9MB | 1,461,045 |
| si | Sinhala | 1GB | 91,456,436 | 791MB | 47,770,919 |
| sk | Slovak | 14GB | 2,002,088,524 | 6GB | 865,456,498 |
| sl | Slovenian | 4GB | 610,843,131 | 1GB | 288,222,997 |
| so | Somali | 15KB | 849 | 13KB | 449 |
| sq | Albanian | 3GB | 493,861,192 | 1GB | 257,278,518 |
| sr | Serbian | 6GB | 574,460,746 | 3GB | 289,211,579 |
| su | Sundanese | 397KB | 54,420 | 274KB | 37,082 |
| sv | Swedish | 43GB | 6,542,433,732 | 19GB | 2,964,887,952 |
| sw | Swahili | 11MB | 1,853,022 | 7MB | 1,279,350 |
| ta | Tamil | 10GB | 438,489,984 | 5GB | 215,856,584 |
| te | Telugu | 3GB | 182,268,133 | 1GB | 73,193,605 |
| tg | Tajik | 985MB | 79,016,232 | 321MB | 26,069,632 |
| th | Thai | 62GB | 1,694,658,532 | 26GB | 635,230,676 |
| tk | Turkmen | 25MB | 2,693,720 | 20MB | 2,221,760 |
| tl | Filipino | 699MB | 115,471,760 | 383MB | 62,473,283 |
| tr | Turkish | 73GB | 8,763,467,387 | 33GB | 3,950,989,357 |
| tt | Tatar | 947MB | 68,793,924 | 424MB | 31,485,000 |
| tyv | Tuvinian | 9KB | 638 | 7KB | 542 |
| ug | Uyghur | 187MB | 12,786,741 | 123MB | 8,410,269 |
| uk | Ukrainian | 53GB | 4,014,675,914 | 28GB | 2,131,491,321 |
| ur | Urdu | 2GB | 354,937,986 | 1GB | 234,111,239 |
| uz | Uzbek | 56MB | 6,237,371 | 28MB | 3,327,595 |
| vec | Venetian | 37KB | 6,694 | 28KB | 5,139 |
| vi | Vietnamese | 87GB | 14,523,772,784 | 42GB | 7,011,404,625 |
| vls | West Flemish | 134B | 2 | 134B | 2 |
| vo | Volapük | 2MB | 426,052 | 2MB | 410,688 |
| war | Waray | 4MB | 750,162 | 4MB | 702,336 |
| wa | Walloon | 511KB | 93,163 | 329KB | 59,906 |
| wuu | Wu Chinese | 145KB | 9,130 | 69KB | 3,031 |
| xal | Kalmyk | 62KB | 5,495 | 62KB | 5,495 |
| xmf | Mingrelian | 16MB | 807,158 | 10MB | 510,700 |
| yi | Yiddish | 199MB | 18,699,112 | 93MB | 8,716,366 |
| yo | Yoruba | 229KB | 34,468 | 120KB | 17,487 |
| zh | Chinese | 500GB | 10,118,381,906 | 266GB | 3,898,987,727 |
</details>
## Dataset Creation
### Curation Rationale
OSCAR was constructed using [`Ungoliant`](https://github.com/oscar-corpus/ungoliant), a new pipeline derived from [goclassy](https://github.com/oscar-corpus/goclassy), itself derived from [the fastText pipeline](https://github.com/facebookresearch/fastText).
OSCAR 21.09 follows the [OSCAR Schema v1.1](https://oscar-corpus.com/post/oscar-schema-v1-1/), which adds metadata to each entry while staying backwards-compatible with OSCAR.
The order of operations is similar to that of the goclassy pipeline, with optimisations regarding IO and a finer granularity regarding multithreading.
`Ungoliant` is implemented in the [Rust programming language](https://rust-lang.org), and uses [rayon](https://github.com/rayon-rs/rayon) as its data parallelism strategy.
Threading is done at shard, record and sentence level, making the whole generation process much more efficient.
Filtering is done at line level, removing lines shorter than 100 UTF-8 codepoints. Invalid UTF-8 characters are detected but not removed; they are replaced with the [Replacement character](https://en.wikipedia.org/wiki/Special_(Unicode_block)#Replacement_character).
After all files are processed, the deduplicated versions are constructed, and everything is then split into shards and compressed.
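The line-level filtering and UTF-8 handling described above can be sketched as follows. This is an illustration of the stated rules (drop lines shorter than 100 codepoints, replace invalid UTF-8 with U+FFFD), not the actual Ungoliant implementation, which is written in Rust:

```python
# Hypothetical sketch of the line-level filtering described above.
MIN_CODEPOINTS = 100

def filter_lines(raw: bytes):
    # errors="replace" substitutes U+FFFD (the Unicode replacement character)
    # for undecodable byte sequences instead of dropping them
    text = raw.decode("utf-8", errors="replace")
    # len() on a Python str counts codepoints, matching the 100-codepoint rule
    return [line for line in text.splitlines() if len(line) >= MIN_CODEPOINTS]
```

Note that the threshold is measured in codepoints, not bytes, so multi-byte scripts (Cyrillic, CJK, Arabic) are not penalised by their encoding length.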
### Source Data
#### Initial Data Collection and Normalization
[Common Crawl](https://commoncrawl.org/) is a non-profit foundation which produces and maintains an open repository of web crawled data that is both accessible and analysable. Common Crawl's complete web archive consists of petabytes of data collected over 8 years of web crawling. The repository contains raw web page HTML data (WARC files), metadata extracts (WAT files) and plain text extracts (WET files). The organisation's crawlers have always respected [nofollow](http://microformats.org/wiki/rel-nofollow) and [robots.txt](https://www.robotstxt.org/) policies.
Each monthly Common Crawl snapshot is in itself a massive multilingual corpus, where every single file contains data coming from multiple web pages written in a large variety of languages and covering all possible types of topics.
To construct OSCAR, the WET files of Common Crawl were used. These contain the extracted plain texts from the websites, mostly converted to UTF-8, as well as headers containing the metadata of each crawled document. Each WET file comes compressed in gzip format and is stored on Amazon Web Services. In the case of OSCAR, the **February 2021** snapshot was used. It consists of 64,000 compressed text files containing documents and their headers.
#### Who are the source language producers?
The data comes from multiple web pages in a large variety of languages.
### Annotations
The dataset does not contain any additional annotations.
#### Annotation process
N/A
#### Who are the annotators?
N/A
### Personal and Sensitive Information
As OSCAR is constructed from Common Crawl, personal and sensitive information might be present. This **must** be considered before training deep learning models with OSCAR, especially in the case of text-generation models.
## Considerations for Using the Data
### Social Impact of Dataset
OSCAR is intended to bring more data to a wide variety of languages; the aim of the corpus is to make large amounts of data available to lower-resource languages in order to facilitate the pre-training of state-of-the-art language modeling architectures.
### Discussion of Biases
OSCAR is not properly filtered yet, and this can be reflected in models trained with it. Care is advised, especially concerning biases in the resulting models.
### Other Known Limitations
The [fastText linear classifier](https://fasttext.cc) is limited both in performance and in the variety of languages it can recognize, so the quality of some OSCAR sub-corpora might be lower than expected, especially for the lowest-resource languages. Some audits have already been done by [third parties](https://arxiv.org/abs/2010.14571).
## Additional Information
### Dataset Curators
The corpus was put together by [Julien Abadji](https://ujj.space), [Pedro Ortiz Suarez](https://portizs.eu/), [Benoît Sagot](http://pauillac.inria.fr/~sagot/), and [Laurent Romary](https://cv.archives-ouvertes.fr/laurentromary), during work done at [Inria](https://www.inria.fr/en), particularly at the [ALMAnaCH team](https://team.inria.fr/almanach/).
### Licensing Information
These data are released under the following licensing scheme:
We do not own any of the text from which these data have been extracted.
We license the actual packaging of these data under the Creative Commons CC0 license ("no rights reserved") http://creativecommons.org/publicdomain/zero/1.0/
To the extent possible under law, Inria has waived all copyright and related or neighboring rights to OSCAR
This work is published from: France.
Should you consider that our data contains material that is owned by you and should therefore not be reproduced here, please:
* Clearly identify yourself, with detailed contact data such as an address, telephone number or email address at which you can be contacted.
* Clearly identify the copyrighted work claimed to be infringed.
* Clearly identify the material that is claimed to be infringing and information reasonably sufficient to allow us to locate the material.
We will comply with legitimate requests by removing the affected sources from the next release of the corpus.
### Citation Information
```
@inproceedings{AbadjiOrtizSuarezRomaryetal.2021,
author = {Julien Abadji and Pedro Javier Ortiz Su{\'a}rez and Laurent Romary and Beno{\^i}t Sagot},
title = {Ungoliant: An optimized pipeline for the generation of a very large-scale multilingual web corpus},
series = {Proceedings of the Workshop on Challenges in the Management of Large Corpora (CMLC-9) 2021. Limerick, 12 July 2021 (Online-Event)},
editor = {Harald L{\"u}ngen and Marc Kupietz and Piotr Bański and Adrien Barbaresi and Simon Clematide and Ines Pisetta},
publisher = {Leibniz-Institut f{\"u}r Deutsche Sprache},
address = {Mannheim},
doi = {10.14618/ids-pub-10468},
url = {https://nbn-resolving.org/urn:nbn:de:bsz:mh39-104688},
pages = {1 -- 9},
year = {2021},
abstract = {Since the introduction of large language models in Natural Language Processing, large raw corpora have played a crucial role in Computational Linguistics. However, most of these large raw corpora are either available only for English or not available to the general public due to copyright issues. Nevertheless, there are some examples of freely available multilingual corpora for training Deep Learning NLP models, such as the OSCAR and Paracrawl corpora. However, they have quality issues, especially for low-resource languages. Moreover, recreating or updating these corpora is very complex. In this work, we try to reproduce and improve the goclassy pipeline used to create the OSCAR corpus. We propose a new pipeline that is faster, modular, parameterizable, and well documented. We use it to create a corpus similar to OSCAR but larger and based on recent data. Also, unlike OSCAR, the metadata information is at the document level. We release our pipeline under an open source license and publish the corpus under a research-only license.},
language = {en}
}
@ARTICLE{caswell-etal-2021-quality,
author = {{Caswell}, Isaac and {Kreutzer}, Julia and {Wang}, Lisa and {Wahab}, Ahsan and {van Esch}, Daan and {Ulzii-Orshikh}, Nasanbayar and {Tapo}, Allahsera and {Subramani}, Nishant and {Sokolov}, Artem and {Sikasote}, Claytone and {Setyawan}, Monang and {Sarin}, Supheakmungkol and {Samb}, Sokhar and {Sagot}, Beno{\^\i}t and {Rivera}, Clara and {Rios}, Annette and {Papadimitriou}, Isabel and {Osei}, Salomey and {Ortiz Su{\'a}rez}, Pedro Javier and {Orife}, Iroro and {Ogueji}, Kelechi and {Niyongabo}, Rubungo Andre and {Nguyen}, Toan Q. and {M{\"u}ller}, Mathias and {M{\"u}ller}, Andr{\'e} and {Hassan Muhammad}, Shamsuddeen and {Muhammad}, Nanda and {Mnyakeni}, Ayanda and {Mirzakhalov}, Jamshidbek and {Matangira}, Tapiwanashe and {Leong}, Colin and {Lawson}, Nze and {Kudugunta}, Sneha and {Jernite}, Yacine and {Jenny}, Mathias and {Firat}, Orhan and {Dossou}, Bonaventure F.~P. and {Dlamini}, Sakhile and {de Silva}, Nisansa and {{\c{C}}abuk Ball{\i}}, Sakine and {Biderman}, Stella and {Battisti}, Alessia and {Baruwa}, Ahmed and {Bapna}, Ankur and {Baljekar}, Pallavi and {Abebe Azime}, Israel and {Awokoya}, Ayodele and {Ataman}, Duygu and {Ahia}, Orevaoghene and {Ahia}, Oghenefego and {Agrawal}, Sweta and {Adeyemi}, Mofetoluwa},
title = "{Quality at a Glance: An Audit of Web-Crawled Multilingual Datasets}",
journal = {arXiv e-prints},
keywords = {Computer Science - Computation and Language, Computer Science - Artificial Intelligence},
year = 2021,
month = mar,
eid = {arXiv:2103.12028},
pages = {arXiv:2103.12028},
archivePrefix = {arXiv},
eprint = {2103.12028},
primaryClass = {cs.CL},
adsurl = {https://ui.adsabs.harvard.edu/abs/2021arXiv210312028C},
adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}
@inproceedings{ortiz-suarez-etal-2020-monolingual,
title = "A Monolingual Approach to Contextualized Word Embeddings for Mid-Resource Languages",
author = "Ortiz Su{\'a}rez, Pedro Javier and
Romary, Laurent and
Sagot, Benoit",
booktitle = "Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.acl-main.156",
pages = "1703--1714",
abstract = "We use the multilingual OSCAR corpus, extracted from Common Crawl via language classification, filtering and cleaning, to train monolingual contextualized word embeddings (ELMo) for five mid-resource languages. We then compare the performance of OSCAR-based and Wikipedia-based ELMo embeddings for these languages on the part-of-speech tagging and parsing tasks. We show that, despite the noise in the Common-Crawl-based OSCAR data, embeddings trained on OSCAR perform much better than monolingual embeddings trained on Wikipedia. They actually equal or improve the current state of the art in tagging and parsing for all five languages. In particular, they also improve over multilingual Wikipedia-based contextual embeddings (multilingual BERT), which almost always constitutes the previous state of the art, thereby showing that the benefit of a larger, more diverse corpus surpasses the cross-lingual benefit of multilingual embedding architectures.",
}
@inproceedings{OrtizSuarezSagotRomary2019,
author = {Pedro Javier {Ortiz Su{\'a}rez} and Benoit Sagot and Laurent Romary},
title = {Asynchronous pipelines for processing huge corpora on medium to low resource infrastructures},
series = {Proceedings of the Workshop on Challenges in the Management of Large Corpora (CMLC-7) 2019. Cardiff, 22nd July 2019},
editor = {Piotr Bański and Adrien Barbaresi and Hanno Biber and Evelyn Breiteneder and Simon Clematide and Marc Kupietz and Harald L{\"u}ngen and Caroline Iliadi},
publisher = {Leibniz-Institut f{\"u}r Deutsche Sprache},
address = {Mannheim},
doi = {10.14618/ids-pub-9021},
url = {http://nbn-resolving.de/urn:nbn:de:bsz:mh39-90215},
pages = {9 -- 16},
year = {2019},
abstract = {Common Crawl is a considerably large, heterogeneous multilingual corpus comprised of crawled documents from the internet, surpassing 20TB of data and distributed as a set of more than 50 thousand plain text files where each contains many documents written in a wide variety of languages. Even though each document has a metadata block associated to it, this data lacks any information about the language in which each document is written, making it extremely difficult to use Common Crawl for monolingual applications. We propose a general, highly parallel, multithreaded pipeline to clean and classify Common Crawl by language; we specifically design it so that it runs efficiently on medium to low resource infrastructures where I/O speeds are the main constraint. We develop the pipeline so that it can be easily reapplied to any kind of heterogeneous corpus and so that it can be parameterised to a wide range of infrastructures. We also distribute a 6.3TB version of Common Crawl, filtered, classified by language, shuffled at line level in order to avoid copyright issues, and ready to be used for NLP applications.},
language = {en}
}
```
### Contributions
Thanks to [@pjox](https://github.com/pjox), [@Uinelj](https://github.com/Uinelj) and [@lhoestq](https://github.com/lhoestq) for adding this dataset.
|
kalyan003/prompt_qasper_merged | ---
license: unlicense
---
|
autoevaluate/autoeval-staging-eval-squad_v2-squad_v2-38b250-14916077 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: deepset/bert-medium-squad2-distilled
metrics: ['bertscore']
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: deepset/bert-medium-squad2-distilled
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
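The `col_mapping` block in the metadata above tells the evaluator which dataset columns feed each field the extractive-QA pipeline expects, with dotted keys addressing nested columns. A minimal sketch of applying such a mapping to one SQuAD v2-shaped record (the record and helper here are illustrative, not AutoTrain internals):

```python
# Sketch: resolve an AutoTrain-style col_mapping against one SQuAD v2 record.
# The helper and record below are illustrative, not AutoTrain internals.
col_mapping = {
    "context": "context",
    "question": "question",
    "answers-text": "answers.text",
    "answers-answer_start": "answers.answer_start",
}

def get_nested(record, dotted_key):
    """Resolve a dotted key like 'answers.text' against a nested dict."""
    value = record
    for part in dotted_key.split("."):
        value = value[part]
    return value

record = {
    "context": "Normandy is a region in France.",
    "question": "Where is Normandy?",
    "answers": {"text": ["France"], "answer_start": [24]},
}

# Map evaluator field names to the values pulled from the dataset columns.
mapped = {field: get_nested(record, column) for field, column in col_mapping.items()}
```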
## Contributions
Thanks to [@nonchalant-nagavalli](https://huggingface.co/nonchalant-nagavalli) for evaluating this model. |
BrookBvn/capitaooo | ---
license: openrail
---
|
anjunhu/CuPL_DaVinci_captioned_CUB2002011_train | ---
dataset_info:
features:
- name: text
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 166122794.75
num_examples: 5994
download_size: 165787380
dataset_size: 166122794.75
---
# Dataset Card for "CuPL_DaVinci_captioned_CUB2002011_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigscience-catalogue-data/lm_en_s2orc_ai2_abstracts | |
scholarly360/contracts-extraction-instruction-llm-experiments | ---
dataset_info:
features:
- name: id
dtype: string
- name: instruction
dtype: string
- name: instances
list:
- name: input
dtype: string
- name: output
dtype: string
- name: is_classification
dtype: bool
splits:
- name: train
num_bytes: 3522722
num_examples: 5732
- name: test
num_bytes: 1512840
num_examples: 2487
download_size: 2128894
dataset_size: 5035562
license: apache-2.0
language:
- en
---
# Dataset Card for "contracts-extraction-instruction-llm-experiments"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/mobile_design_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 47497286
num_examples: 100000
download_size: 5304551
dataset_size: 47497286
---
# Dataset Card for "mobile_design_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
diffusers-parti-prompts/sd-v2.1 | ---
dataset_info:
features:
- name: Prompt
dtype: string
- name: Category
dtype: string
- name: Challenge
dtype: string
- name: Note
dtype: string
- name: images
dtype: image
- name: model_name
dtype: string
- name: seed
dtype: int64
splits:
- name: train
num_bytes: 191652463.0
num_examples: 1632
download_size: 191500777
dataset_size: 191652463.0
---
# Images of Parti Prompts for "sd-v2.1"
Code that was used to get the results:
```py
from diffusers import DiffusionPipeline, DDIMScheduler
import torch
import PIL
pipe = DiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-2-1", torch_dtype=torch.float16, safety_checker=None)
pipe.to("cuda")
pipe.scheduler = DDIMScheduler.from_config(pipe.scheduler.config)
prompt = "" # a parti prompt
generator = torch.Generator("cuda").manual_seed(0)
image = pipe(prompt, generator=generator, num_inference_steps=100, guidance_scale=7.5).images[0]
image = image.resize((256, 256), resample=PIL.Image.Resampling.LANCZOS)
```
|
Cosmos-AI/Cosmos-dataset | ---
language:
- en
pretty_name: Cosmos dataset v1
---
v1 |
indonlp/nusaparagraph_rhetoric | ---
license: apache-2.0
---
|
cagliostrolab/860k-ordered-tags-json | ---
license: mit
task_categories:
- text-to-image
language:
- en
tags:
- art
- not-for-all-audiences
size_categories:
- 100K<n<1M
viewer: false
--- |
pavan331999/malay-speech | ---
license: mpl-2.0
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_59 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1314988632.0
num_examples: 258246
download_size: 1336974329
dataset_size: 1314988632.0
---
# Dataset Card for "chunk_59"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vinnyyw/Anyvoice | ---
license: openrail
---
|
erfanzar/GPT4-8K | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: dialogs
sequence: string
- name: user
sequence: string
- name: assistant
sequence: string
- name: llama2_prompt
dtype: string
splits:
- name: train
num_bytes: 193605433
num_examples: 6144
download_size: 90877640
dataset_size: 193605433
task_categories:
- text-classification
- translation
- conversational
- text-generation
- summarization
language:
- en
pretty_name: GPT4
size_categories:
- 1K<n<10K
---
# Dataset Card for "GPT4-8K"
# Dataset Description
This dataset was generated using GPT-4, a powerful language model developed by OpenAI. It contains a collection of dialogs between a user and an assistant from OpenChat, along with additional information.
## Dataset Configurations
The dataset includes the following configurations:
- **Config Name:** default
- **Data Files:**
- **Split:** train
- **Path:** data/train-*
## Dataset Information
The dataset consists of the following features:
- **Dialogs:** A sequence of strings representing the dialog between the user and the assistant.
- **User:** A sequence of strings representing the user's input during the dialog.
- **Assistant:** A sequence of strings representing the assistant's responses during the dialog.
- **Llama2 Prompt:** A string representing additional prompt information related to the Llama2 model.
The dataset is divided into the following splits:
- **Train:**
- **Number of Bytes:** 193,605,433
- **Number of Examples:** 6,144
## Dataset Size and Download
- **Download Size:** 90,877,640 bytes
- **Dataset Size:** 193,605,433 bytes
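The card lists `llama2_prompt` as a single flattened string built from the parallel `user` and `assistant` turn lists. The exact template used for this dataset is not documented here, so the following is a hedged sketch using the commonly seen Llama-2 `[INST]` chat format:

```python
# Sketch: fold parallel user/assistant turn lists into a Llama-2-style chat
# prompt. The [INST] template below is an assumption -- the card does not
# document the exact template used to produce the llama2_prompt column.
def build_llama2_prompt(users, assistants, system=""):
    parts = []
    for i, (user_turn, assistant_turn) in enumerate(zip(users, assistants)):
        if i == 0 and system:
            # Llama-2 convention: the system prompt is folded into the first turn.
            user_turn = f"<<SYS>>\n{system}\n<</SYS>>\n\n{user_turn}"
        parts.append(f"<s>[INST] {user_turn} [/INST] {assistant_turn} </s>")
    return "".join(parts)

prompt = build_llama2_prompt(
    ["Hi!", "What is 2+2?"],
    ["Hello, how can I help?", "2+2 equals 4."],
)
```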
Please note that this dataset was generated by GPT-4 and may contain synthetic or simulated data. It is intended for research and experimentation purposes.
For more information or inquiries, please contact the dataset owner.
Thank you for using this dataset! |
Kant1/French_Wikinews_articles | ---
task_categories:
- text-generation
language:
- fr
---
Dump of 2023-08-20 of all French articles in Wikinews:
https://dumps.wikimedia.org/frwikinews/20230820/frwikinews-20230820-pages-articles.xml.bz2 |
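The linked dump is a bz2-compressed MediaWiki `pages-articles` XML export. A minimal sketch of streaming page titles and wikitext out of such a file with only the standard library (element names follow the MediaWiki export schema; the namespace URI varies between dump versions, so the code matches tags in any namespace):

```python
import bz2
import io
import xml.etree.ElementTree as ET

def iter_pages(stream):
    """Yield (title, wikitext) from a MediaWiki pages-articles XML stream,
    clearing each <page> element as we go so large dumps fit in memory."""
    for _, elem in ET.iterparse(stream):
        if elem.tag.rsplit("}", 1)[-1] == "page":
            title = elem.findtext(".//{*}title", default="")
            text = elem.findtext(".//{*}text", default="")
            yield title, text
            elem.clear()

# With the real dump, decompress on the fly:
# with bz2.open("frwikinews-20230820-pages-articles.xml.bz2", "rb") as f:
#     for title, text in iter_pages(f):
#         ...

# Tiny inline sample in the same shape as the export schema:
sample = b"""<mediawiki>
  <page><title>Exemple</title><revision><text>Bonjour</text></revision></page>
</mediawiki>"""
pages = list(iter_pages(io.BytesIO(sample)))
```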
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/9bf6da77 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1340
dataset_size: 182
---
# Dataset Card for "9bf6da77"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sjsq/PrivacyPolicy | ---
license: apache-2.0
---
|
dtruong46me/tokenized-dataset-dialogsum | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 76653920
num_examples: 12460
- name: validation
num_bytes: 3076000
num_examples: 500
- name: test
num_bytes: 9228000
num_examples: 1500
download_size: 5329517
dataset_size: 88957920
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.5 | ---
pretty_name: Evaluation run of migtissera/SynthIA-7B-v1.5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [migtissera/SynthIA-7B-v1.5](https://huggingface.co/migtissera/SynthIA-7B-v1.5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.5_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-09T14:41:56.883085](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.5_public/blob/main/results_2023-11-09T14-41-56.883085.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6291968571108129,\n\
\ \"acc_stderr\": 0.03252538162461919,\n \"acc_norm\": 0.63804599014876,\n\
\ \"acc_norm_stderr\": 0.03323519542303871,\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.5131996962275648,\n\
\ \"mc2_stderr\": 0.015337988977122931,\n \"em\": 0.1875,\n \
\ \"em_stderr\": 0.003997164044486006,\n \"f1\": 0.26010591442953035,\n\
\ \"f1_stderr\": 0.004042449995216609\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398324,\n\
\ \"acc_norm\": 0.6271331058020477,\n \"acc_norm_stderr\": 0.014131176760131172\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6432981477793268,\n\
\ \"acc_stderr\": 0.0047804672709117705,\n \"acc_norm\": 0.833698466440948,\n\
\ \"acc_norm_stderr\": 0.0037159010850549967\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n\
\ \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n\
\ \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02959732973097809,\n \
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02959732973097809\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.016530617409266875,\n \"\
acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266875\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565438,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565438\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.03351953879521272,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.03351953879521272\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128138,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128138\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579828,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579828\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0246853168672578,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0246853168672578\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.376536312849162,\n\
\ \"acc_stderr\": 0.016204672385106596,\n \"acc_norm\": 0.376536312849162,\n\
\ \"acc_norm_stderr\": 0.016204672385106596\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881876,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881876\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4589308996088657,\n\
\ \"acc_stderr\": 0.012727084826799797,\n \"acc_norm\": 0.4589308996088657,\n\
\ \"acc_norm_stderr\": 0.012727084826799797\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \
\ \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.02950489645459596,\n\
\ \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.02950489645459596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774711,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774711\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.5131996962275648,\n\
\ \"mc2_stderr\": 0.015337988977122931\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7924230465666929,\n \"acc_stderr\": 0.01139859341938678\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.1875,\n \"em_stderr\"\
: 0.003997164044486006,\n \"f1\": 0.26010591442953035,\n \"f1_stderr\"\
: 0.004042449995216609\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.17437452615617893,\n\
\ \"acc_stderr\": 0.010451421361976231\n }\n}\n```"
repo_url: https://huggingface.co/migtissera/SynthIA-7B-v1.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|arc:challenge|25_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|drop|3_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|gsm8k|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hellaswag|10_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T14-41-56.883085.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-09T14-41-56.883085.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- '**/details_harness|winogrande|5_2023-11-09T14-41-56.883085.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-09T14-41-56.883085.parquet'
- config_name: results
data_files:
- split: 2023_11_09T14_41_56.883085
path:
- results_2023-11-09T14-41-56.883085.parquet
- split: latest
path:
- results_2023-11-09T14-41-56.883085.parquet
---
# Dataset Card for Evaluation run of migtissera/SynthIA-7B-v1.5
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/SynthIA-7B-v1.5
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/SynthIA-7B-v1.5](https://huggingface.co/migtissera/SynthIA-7B-v1.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.5_public",
"harness_winogrande_5",
	split="latest")
```
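Each MMLU sub-task listed above is exposed as its own configuration, and the config names follow a fixed pattern. A minimal sketch for building these names programmatically (the `harness_config` helper is our own convenience function, not part of the `datasets` API):

```python
# Build the configuration name for a given MMLU sub-task, following the
# naming pattern of the configs listed in this card's YAML header,
# e.g. "harness_hendrycksTest_anatomy_5".
def harness_config(task: str, shots: int = 5) -> str:
    return f"harness_hendrycksTest_{task}_{shots}"

print(harness_config("abstract_algebra"))  # harness_hendrycksTest_abstract_algebra_5

# With a config name in hand, the per-sample details can be loaded as shown
# above (uncomment to download from the Hub):
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.5_public",
#     harness_config("abstract_algebra"),
#     split="latest",
# )
```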
## Latest results
These are the [latest results from run 2023-11-09T14:41:56.883085](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__SynthIA-7B-v1.5_public/blob/main/results_2023-11-09T14-41-56.883085.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6291968571108129,
"acc_stderr": 0.03252538162461919,
"acc_norm": 0.63804599014876,
"acc_norm_stderr": 0.03323519542303871,
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.5131996962275648,
"mc2_stderr": 0.015337988977122931,
"em": 0.1875,
"em_stderr": 0.003997164044486006,
"f1": 0.26010591442953035,
"f1_stderr": 0.004042449995216609
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398324,
"acc_norm": 0.6271331058020477,
"acc_norm_stderr": 0.014131176760131172
},
"harness|hellaswag|10": {
"acc": 0.6432981477793268,
"acc_stderr": 0.0047804672709117705,
"acc_norm": 0.833698466440948,
"acc_norm_stderr": 0.0037159010850549967
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02959732973097809,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02959732973097809
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266875,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266875
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565438,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565438
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.03351953879521272,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.03351953879521272
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128138,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128138
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579828,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579828
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0246853168672578,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0246853168672578
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.376536312849162,
"acc_stderr": 0.016204672385106596,
"acc_norm": 0.376536312849162,
"acc_norm_stderr": 0.016204672385106596
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881876,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881876
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4589308996088657,
"acc_stderr": 0.012727084826799797,
"acc_norm": 0.4589308996088657,
"acc_norm_stderr": 0.012727084826799797
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.02950489645459596,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.02950489645459596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774711,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774711
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.5131996962275648,
"mc2_stderr": 0.015337988977122931
},
"harness|winogrande|5": {
"acc": 0.7924230465666929,
"acc_stderr": 0.01139859341938678
},
"harness|drop|3": {
"em": 0.1875,
"em_stderr": 0.003997164044486006,
"f1": 0.26010591442953035,
"f1_stderr": 0.004042449995216609
},
"harness|gsm8k|5": {
"acc": 0.17437452615617893,
"acc_stderr": 0.010451421361976231
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_mrpc_past_been | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 308328
num_examples: 1105
- name: train
num_bytes: 642905
num_examples: 2301
- name: validation
num_bytes: 71879
num_examples: 252
download_size: 668060
dataset_size: 1023112
---
# Dataset Card for "MULTI_VALUE_mrpc_past_been"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
matlok/python-audio-copilot-training-using-inheritance-knowledge-graphs | ---
license:
- other
pretty_name: >-
python copilot audio training using inheritance and polymorphism knowledge graphs
dataset_info:
- config_name: view_schema
splits:
- name: view_schema
configs:
- config_name: view_schema
data_files:
- split: view_schema
path: files/lok-python-copilot-audio.base-v1_00000291.parquet
size_categories:
- 10K<n<100K
tags:
- python-copilot
- python-coding
- python-architecture
- knowledge-graphs
- multimodal
- text-image-audio
- fine-tuning
- training
- question-answering
- image-knowledge-graph
- alpaca
- mp3
- png
- text
- instruct
- inheritance
# supported task_categories
# text-classification, token-classification, table-question-answering, question-answering, zero-shot-classification, translation, summarization, conversational, feature-extraction, text-generation, text2text-generation, fill-mask, sentence-similarity, text-to-speech, text-to-audio, automatic-speech-recognition, audio-to-audio, audio-classification, voice-activity-detection, depth-estimation, image-classification, object-detection, image-segmentation, text-to-image, image-to-text, image-to-image, image-to-video, unconditional-image-generation, video-classification, reinforcement-learning, robotics, tabular-classification, tabular-regression, tabular-to-text, table-to-text, multiple-choice, text-retrieval, time-series-forecasting, text-to-video, visual-question-answering, document-question-answering, zero-shot-image-classification, graph-ml, mask-generation, zero-shot-object-detection, text-to-3d, image-to-3d, other
task_categories:
- text-to-audio
- audio-to-audio
- question-answering
# supported task_ids
# acceptability-classification, entity-linking-classification, fact-checking, intent-classification, language-identification, multi-class-classification, multi-label-classification, multi-input-text-classification, natural-language-inference, semantic-similarity-classification, sentiment-classification, topic-classification, semantic-similarity-scoring, sentiment-scoring, sentiment-analysis, hate-speech-detection, text-scoring, named-entity-recognition, part-of-speech, parsing, lemmatization, word-sense-disambiguation, coreference-resolution, extractive-qa, open-domain-qa, closed-domain-qa, news-articles-summarization, news-articles-headline-generation, dialogue-generation, dialogue-modeling, language-modeling, text-simplification, explanation-generation, abstractive-qa, open-domain-abstractive-qa, closed-domain-qa, open-book-qa, closed-book-qa, slot-filling, masked-language-modeling, keyword-spotting, speaker-identification, audio-intent-classification, audio-emotion-recognition, audio-language-identification, multi-label-image-classification, multi-class-image-classification, face-detection, vehicle-detection, instance-segmentation, semantic-segmentation, panoptic-segmentation, image-captioning, image-inpainting, image-colorization, super-resolution, grasping, task-planning, tabular-multi-class-classification, tabular-multi-label-classification, tabular-single-column-regression, rdf-to-text, multiple-choice-qa, multiple-choice-coreference-resolution, document-retrieval, utterance-retrieval, entity-linking-retrieval, fact-checking-retrieval, univariate-time-series-forecasting, multivariate-time-series-forecasting, visual-question-answering, document-question-answering
task_ids:
- parsing
---
## Python Copilot Audio Training using Inheritance and Polymorphism Knowledge Graphs
This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each base class for each unique class in each module file has a question-and-answer mp3, where one voice reads the question and another voice reads the answer. The mp3 bytes are stored in the parquet **dbytes** column, along with the associated source code **file_path** identifier.
- Rows: 96874
- Size: 29.9 GB
- Data type: mp3
- Format: narrated alpaca question and answers using two voices
### Schema
```
{
"audio_path": "string",
"audio_type": "string",
"dbytes": "binary",
"dbytes_len": "int64",
"file_path": "string",
"file_path_len": "int64",
"lang": "string",
"lang_len": "int64",
"recsize": "int64"
}
```
### How to use the dataset
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-audio-copilot-training-using-inheritance-knowledge-graphs", data_dir="files")
```
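Once loaded, each row's mp3 bytes can be written back to a playable file. A minimal sketch, assuming the schema above (the `save_audio_row` helper and the `out_dir` default are illustrative, not part of the dataset):

```python
import os

def save_audio_row(row, out_dir="mp3s"):
    """Write one row's mp3 bytes (the `dbytes` column) to disk,
    named after the basename of its `audio_path`."""
    os.makedirs(out_dir, exist_ok=True)
    out_path = os.path.join(out_dir, os.path.basename(row["audio_path"]))
    with open(out_path, "wb") as f:
        f.write(row["dbytes"])
    return out_path
```

For example, `save_audio_row(ds["view_schema"][0])` would write the first narrated question-and-answer mp3 into `mp3s/`.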
|
TiagoJacobs/test | ---
license: apache-2.0
---
|
nguyenvulebinh/wham | ---
dataset_info:
features:
- name: utterance_id
dtype: string
- name: noise_file
dtype: string
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 4743740197.0
num_examples: 25000
download_size: 4742961559
dataset_size: 4743740197.0
---
# Dataset Card for "wham"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
emozilla/pg19 | ---
dataset_info:
features:
- name: short_book_title
dtype: string
- name: publication_date
dtype: int32
- name: url
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 11453688452
num_examples: 28602
- name: validation
num_bytes: 17402295
num_examples: 50
- name: test
num_bytes: 40482852
num_examples: 100
download_size: 2257437892
dataset_size: 11511573599
---
# Dataset Card for "pg19"
Parquet version of [pg19](https://huggingface.co/datasets/pg19)
Statistics (in # of characters): `total_len: 11425076324, average_len: 399450.2595622684` |
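The `average_len` figure appears to be `total_len` divided by the 28602 train-split books; a quick sanity check (using the train split as the denominator is an assumption):

```python
total_len = 11_425_076_324  # total characters, from the statistics above
num_books = 28_602          # train-split examples (assumed denominator)

average_len = total_len / num_books
print(average_len)  # ≈ 399450.2595622684
```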
UCLNLP/sharc | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
- expert-generated
language:
- en
license:
- cc-by-sa-3.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- question-answering
task_ids:
- extractive-qa
paperswithcode_id: sharc
pretty_name: Shaping Answers with Rules through Conversation
tags:
- conversational-qa
dataset_info:
features:
- name: id
dtype: string
- name: utterance_id
dtype: string
- name: source_url
dtype: string
- name: snippet
dtype: string
- name: question
dtype: string
- name: scenario
dtype: string
- name: history
list:
- name: follow_up_question
dtype: string
- name: follow_up_answer
dtype: string
- name: evidence
list:
- name: follow_up_question
dtype: string
- name: follow_up_answer
dtype: string
- name: answer
dtype: string
- name: negative_question
dtype: bool_
- name: negative_scenario
dtype: bool_
config_name: sharc
splits:
- name: train
num_bytes: 15088577
num_examples: 21890
- name: validation
num_bytes: 1469172
num_examples: 2270
download_size: 5230207
dataset_size: 16557749
---
# Dataset Card for Shaping Answers with Rules through Conversation
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [ShARC](https://sharc-data.github.io/index.html)
- **Repository:** [If the dataset is hosted on github or has a github homepage, add URL here]()
- **Paper:** [Interpretation of Natural Language Rules in Conversational Machine Reading](https://arxiv.org/abs/1809.01494)
- **Leaderboard:** [leaderboard](https://sharc-data.github.io/leaderboard.html)
- **Point of Contact:** [Marzieh Saeidi](marzieh.saeidi@gmail.com), [Max Bartolo](maxbartolo@gmail.com), [Patrick Lewis](patrick.s.h.lewis@gmail.com), [Sebastian Riedel](s.riedel@cs.ucl.ac.uk)
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@patil-suraj](https://github.com/patil-suraj) for adding this dataset. |
Cohere/wikipedia-22-12-es-embeddings | ---
annotations_creators:
- expert-generated
language:
- es
multilinguality:
- multilingual
size_categories: []
source_datasets: []
tags: []
task_categories:
- text-retrieval
license:
- apache-2.0
task_ids:
- document-retrieval
---
# Wikipedia (es) embedded with cohere.ai `multilingual-22-12` encoder
We encoded [Wikipedia (es)](https://es.wikipedia.org) using the [cohere.ai](https://txt.cohere.ai/multilingual/) `multilingual-22-12` embedding model.
To get an overview how this dataset was created and pre-processed, have a look at [Cohere/wikipedia-22-12](https://huggingface.co/datasets/Cohere/wikipedia-22-12).
## Embeddings
We compute embeddings for `title+" "+text` using our `multilingual-22-12` embedding model, a state-of-the-art model that works for semantic search in 100 languages. If you want to learn more about this model, have a look at [cohere.ai multilingual embedding model](https://txt.cohere.ai/multilingual/).
## Further languages
We provide embeddings of Wikipedia in many different languages:
[ar](https://huggingface.co/datasets/Cohere/wikipedia-22-12-ar-embeddings), [de](https://huggingface.co/datasets/Cohere/wikipedia-22-12-de-embeddings), [en](https://huggingface.co/datasets/Cohere/wikipedia-22-12-en-embeddings), [es](https://huggingface.co/datasets/Cohere/wikipedia-22-12-es-embeddings), [fr](https://huggingface.co/datasets/Cohere/wikipedia-22-12-fr-embeddings), [hi](https://huggingface.co/datasets/Cohere/wikipedia-22-12-hi-embeddings), [it](https://huggingface.co/datasets/Cohere/wikipedia-22-12-it-embeddings), [ja](https://huggingface.co/datasets/Cohere/wikipedia-22-12-ja-embeddings), [ko](https://huggingface.co/datasets/Cohere/wikipedia-22-12-ko-embeddings), [simple english](https://huggingface.co/datasets/Cohere/wikipedia-22-12-simple-embeddings), [zh](https://huggingface.co/datasets/Cohere/wikipedia-22-12-zh-embeddings),
You can find the Wikipedia datasets without embeddings at [Cohere/wikipedia-22-12](https://huggingface.co/datasets/Cohere/wikipedia-22-12).
## Loading the dataset
You can either load the dataset like this:
```python
from datasets import load_dataset
docs = load_dataset("Cohere/wikipedia-22-12-es-embeddings", split="train")
```
Or you can also stream it without downloading it before:
```python
from datasets import load_dataset
docs = load_dataset("Cohere/wikipedia-22-12-es-embeddings", split="train", streaming=True)
for doc in docs:
docid = doc['id']
title = doc['title']
text = doc['text']
emb = doc['emb']
```
## Search
A full search example:
```python
#Run: pip install cohere datasets
from datasets import load_dataset
import torch
import cohere
co = cohere.Client("<<COHERE_API_KEY>>")  # Add your cohere API key from www.cohere.com
#Load at max 1000 documents + embeddings
max_docs = 1000
docs_stream = load_dataset("Cohere/wikipedia-22-12-es-embeddings", split="train", streaming=True)
docs = []
doc_embeddings = []
for doc in docs_stream:
docs.append(doc)
doc_embeddings.append(doc['emb'])
if len(docs) >= max_docs:
break
doc_embeddings = torch.tensor(doc_embeddings)
query = 'Who founded Youtube'
response = co.embed(texts=[query], model='multilingual-22-12')
query_embedding = response.embeddings
query_embedding = torch.tensor(query_embedding)
# Compute dot score between query embedding and document embeddings
dot_scores = torch.mm(query_embedding, doc_embeddings.transpose(0, 1))
top_k = torch.topk(dot_scores, k=3)
# Print results
print("Query:", query)
for doc_id in top_k.indices[0].tolist():
print(docs[doc_id]['title'])
print(docs[doc_id]['text'], "\n")
```
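The example above ranks documents by raw dot product; if your embeddings are not normalized and you prefer scores bounded in [-1, 1], cosine similarity is a drop-in alternative. A minimal pure-Python sketch (independent of torch and the Cohere client):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # 1.0
```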
## Performance
You can find performance on the MIRACL dataset (a semantic search evaluation dataset) here: [miracl-en-queries-22-12#performance](https://huggingface.co/datasets/Cohere/miracl-en-queries-22-12#performance) |
damerajee/IMDB-sentiment-reviews | ---
license: mit
task_categories:
- text-classification
language:
- en
tags:
- art
pretty_name: IMDB -reviews-Sentiment
size_categories:
- 10K<n<100K
---
# This dataset contains reviews of movies on IMDB
## Columns include:
- review
- sentiment
# What can you do with this dataset
- perform fine-tuning using your preferred models
- text generation
# More rows and columns might be added
ManuelAlv/PubMed | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 22699692
num_examples: 135030
- name: validation
num_bytes: 5673744
num_examples: 33757
- name: test
num_bytes: 1895905
num_examples: 11253
download_size: 18142349
dataset_size: 30269341
---
# Dataset Card for "PubMed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
malucoelhaofc/PomniV2 | ---
license: openrail
---
|
Mitsuki-Sakamoto/alpaca_farm-deberta-re-pref-64-fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.0_seed_3_tp_0.3 | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43644703
num_examples: 18928
- name: epoch_1
num_bytes: 44104341
num_examples: 18928
- name: epoch_2
num_bytes: 44179810
num_examples: 18928
- name: epoch_3
num_bytes: 44222641
num_examples: 18928
- name: epoch_4
num_bytes: 44248171
num_examples: 18928
- name: epoch_5
num_bytes: 44260511
num_examples: 18928
- name: epoch_6
num_bytes: 44261058
num_examples: 18928
- name: epoch_7
num_bytes: 44256480
num_examples: 18928
- name: epoch_8
num_bytes: 44252617
num_examples: 18928
- name: epoch_9
num_bytes: 44251282
num_examples: 18928
- name: epoch_10
num_bytes: 44252114
num_examples: 18928
- name: epoch_11
num_bytes: 44252797
num_examples: 18928
- name: epoch_12
num_bytes: 44250821
num_examples: 18928
- name: epoch_13
num_bytes: 44251405
num_examples: 18928
- name: epoch_14
num_bytes: 44251113
num_examples: 18928
- name: epoch_15
num_bytes: 44252322
num_examples: 18928
- name: epoch_16
num_bytes: 44252800
num_examples: 18928
- name: epoch_17
num_bytes: 44251498
num_examples: 18928
- name: epoch_18
num_bytes: 44250822
num_examples: 18928
- name: epoch_19
num_bytes: 44250117
num_examples: 18928
- name: epoch_20
num_bytes: 44249659
num_examples: 18928
- name: epoch_21
num_bytes: 44249958
num_examples: 18928
- name: epoch_22
num_bytes: 44250697
num_examples: 18928
- name: epoch_23
num_bytes: 44249960
num_examples: 18928
- name: epoch_24
num_bytes: 44250560
num_examples: 18928
- name: epoch_25
num_bytes: 44250346
num_examples: 18928
- name: epoch_26
num_bytes: 44250820
num_examples: 18928
- name: epoch_27
num_bytes: 44249515
num_examples: 18928
- name: epoch_28
num_bytes: 44249415
num_examples: 18928
- name: epoch_29
num_bytes: 44250535
num_examples: 18928
download_size: 680749796
dataset_size: 1326698888
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
---
|
open-llm-leaderboard/details_rinna__youri-7b-chat | ---
pretty_name: Evaluation run of rinna/youri-7b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [rinna/youri-7b-chat](https://huggingface.co/rinna/youri-7b-chat) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 1 configuration, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 4 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split always points to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rinna__youri-7b-chat\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
  \ are the [latest results from run 2023-12-02T15:12:23.080545](https://huggingface.co/datasets/open-llm-leaderboard/details_rinna__youri-7b-chat/blob/main/results_2023-12-02T15-12-23.080545.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.013646702047005308,\n\
\ \"acc_stderr\": 0.0031957470754808235\n },\n \"harness|gsm8k|5\"\
: {\n \"acc\": 0.013646702047005308,\n \"acc_stderr\": 0.0031957470754808235\n\
\ }\n}\n```"
repo_url: https://huggingface.co/rinna/youri-7b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_02T15_11_37.192628
path:
- '**/details_harness|gsm8k|5_2023-12-02T15-11-37.192628.parquet'
- split: 2023_12_02T15_11_43.336973
path:
- '**/details_harness|gsm8k|5_2023-12-02T15-11-43.336973.parquet'
- split: 2023_12_02T15_11_58.617219
path:
- '**/details_harness|gsm8k|5_2023-12-02T15-11-58.617219.parquet'
- split: 2023_12_02T15_12_23.080545
path:
- '**/details_harness|gsm8k|5_2023-12-02T15-12-23.080545.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T15-12-23.080545.parquet'
- config_name: results
data_files:
- split: 2023_12_02T15_11_37.192628
path:
- results_2023-12-02T15-11-37.192628.parquet
- split: 2023_12_02T15_11_43.336973
path:
- results_2023-12-02T15-11-43.336973.parquet
- split: 2023_12_02T15_11_58.617219
path:
- results_2023-12-02T15-11-58.617219.parquet
- split: 2023_12_02T15_12_23.080545
path:
- results_2023-12-02T15-12-23.080545.parquet
- split: latest
path:
- results_2023-12-02T15-12-23.080545.parquet
---
# Dataset Card for Evaluation run of rinna/youri-7b-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/rinna/youri-7b-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [rinna/youri-7b-chat](https://huggingface.co/rinna/youri-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 1 configuration, corresponding to the evaluated task.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rinna__youri-7b-chat",
"harness_gsm8k_5",
split="train")
```
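Each timestamped split listed in the card metadata is simply the run timestamp with `-` and `:` replaced by `_` (for example, run `2023-12-02T15:12:23.080545` maps to split `2023_12_02T15_12_23.080545`). A minimal helper, shown here purely to illustrate the naming convention (it is not part of the `datasets` API), makes the mapping explicit:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Convert a run timestamp into the corresponding split name.

    Split names cannot contain `-` or `:`, so both characters
    are replaced by `_` in the stored split identifiers.
    """
    return timestamp.replace("-", "_").replace(":", "_")

# The latest run of this evaluation:
print(run_timestamp_to_split("2023-12-02T15:12:23.080545"))
# → 2023_12_02T15_12_23.080545
```

The resulting name can be passed as `split=` to `load_dataset` to pin a specific run instead of using `train` or `latest`.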
## Latest results
These are the [latest results from run 2023-12-02T15:12:23.080545](https://huggingface.co/datasets/open-llm-leaderboard/details_rinna__youri-7b-chat/blob/main/results_2023-12-02T15-12-23.080545.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.013646702047005308,
"acc_stderr": 0.0031957470754808235
},
"harness|gsm8k|5": {
"acc": 0.013646702047005308,
"acc_stderr": 0.0031957470754808235
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
DanielSongShen/CLIP-food101-image-dataset-medium_latents | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': apple_pie
'1': baby_back_ribs
'2': baklava
'3': beef_carpaccio
'4': beef_tartare
'5': beet_salad
'6': beignets
'7': bibimbap
'8': bread_pudding
'9': breakfast_burrito
'10': bruschetta
'11': caesar_salad
'12': cannoli
'13': caprese_salad
'14': carrot_cake
'15': ceviche
'16': cheesecake
'17': cheese_plate
'18': chicken_curry
'19': chicken_quesadilla
'20': chicken_wings
'21': chocolate_cake
'22': chocolate_mousse
'23': churros
'24': clam_chowder
'25': club_sandwich
'26': crab_cakes
'27': creme_brulee
'28': croque_madame
'29': cup_cakes
'30': deviled_eggs
'31': donuts
'32': dumplings
'33': edamame
'34': eggs_benedict
'35': escargots
'36': falafel
'37': filet_mignon
'38': fish_and_chips
'39': foie_gras
'40': french_fries
'41': french_onion_soup
'42': french_toast
'43': fried_calamari
'44': fried_rice
'45': frozen_yogurt
'46': garlic_bread
'47': gnocchi
'48': greek_salad
'49': grilled_cheese_sandwich
'50': grilled_salmon
'51': guacamole
'52': gyoza
'53': hamburger
'54': hot_and_sour_soup
'55': hot_dog
'56': huevos_rancheros
'57': hummus
'58': ice_cream
'59': lasagna
'60': lobster_bisque
'61': lobster_roll_sandwich
'62': macaroni_and_cheese
'63': macarons
'64': miso_soup
'65': mussels
'66': nachos
'67': omelette
'68': onion_rings
'69': oysters
'70': pad_thai
'71': paella
'72': pancakes
'73': panna_cotta
'74': peking_duck
'75': pho
'76': pizza
'77': pork_chop
'78': poutine
'79': prime_rib
'80': pulled_pork_sandwich
'81': ramen
'82': ravioli
'83': red_velvet_cake
'84': risotto
'85': samosa
'86': sashimi
'87': scallops
'88': seaweed_salad
'89': shrimp_and_grits
'90': spaghetti_bolognese
'91': spaghetti_carbonara
'92': spring_rolls
'93': steak
'94': strawberry_shortcake
'95': sushi
'96': tacos
'97': takoyaki
'98': tiramisu
'99': tuna_tartare
'100': waffles
- name: CLIP_image_latent
sequence:
sequence: float32
splits:
- name: train
num_bytes: 695217996.0
num_examples: 16000
- name: test
num_bytes: 175124282.0
num_examples: 4000
download_size: 890685739
dataset_size: 870342278.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
hundredeuk2/ranking_data | ---
dataset_info:
features:
- name: question
dtype: string
- name: response_j
dtype: string
- name: response_k
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 84003012
num_examples: 67830
download_size: 9031121
dataset_size: 84003012
---
# Dataset Card for "ranking_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AiHevenpen/setup | ---
license: mit
tags:
- music
pretty_name: Decmus WebUI
---
decmus webui |
johannes-garstenauer/ENN_masking_embeddings_dim_512 | ---
dataset_info:
features:
- name: last_hs
sequence: float32
- name: label
dtype: int64
splits:
- name: train
num_bytes: 138580320
num_examples: 67272
download_size: 177638515
dataset_size: 138580320
---
# Dataset Card for "ENN_masking_embeddings_dim_512"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Augustya07/neitzsche_beyond_good_and_evil_convo | ---
license: mit
---
|
ibranze/araproje_hellaswag_tr_s2 | ---
dataset_info:
features:
- name: ind
dtype: int32
- name: activity_label
dtype: string
- name: ctx_a
dtype: string
- name: ctx_b
dtype: string
- name: ctx
dtype: string
- name: endings
sequence: string
- name: source_id
dtype: string
- name: split
dtype: string
- name: split_type
dtype: string
- name: label
dtype: string
splits:
- name: validation
num_bytes: 162703.0
num_examples: 250
download_size: 88572
dataset_size: 162703.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_hellaswag_tr_s2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sushantmenon123/Kathakali | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 71728.0
num_examples: 5
download_size: 72596
dataset_size: 71728.0
---
# Dataset Card for "Kathakali"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Paul/hatecheck-polish | ---
annotations_creators:
- crowdsourced
language_creators:
- expert-generated
language:
- pl
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Polish HateCheck
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- hate-speech-detection
---
# Dataset Card for Multilingual HateCheck
## Dataset Description
Multilingual HateCheck (MHC) is a suite of functional tests for hate speech detection models in 10 different languages: Arabic, Dutch, French, German, Hindi, Italian, Mandarin, Polish, Portuguese and Spanish.
For each language, there are 25+ functional tests that correspond to distinct types of hate and challenging non-hate.
This allows for targeted diagnostic insights into model performance.
For more details, please refer to our paper about MHC, published at the Workshop on Online Abuse and Harms (WOAH) at NAACL 2022. If you are using MHC, please cite our work!
- **Paper:** Röttger et al. (2022) - Multilingual HateCheck: Functional Tests for Multilingual Hate Speech Detection Models. https://arxiv.org/abs/2206.09917
- **Repository:** https://github.com/rewire-online/multilingual-hatecheck
- **Point of Contact:** paul@rewire.online
## Dataset Structure
The CSV format mostly matches the original HateCheck data, with some adjustments for specific languages.
**mhc_case_id**
The test case ID that is unique to each test case across languages (e.g., "mandarin-1305")
**functionality**
The shorthand for the functionality tested by the test case (e.g, "target_obj_nh"). The same functionalities are tested in all languages, except for Mandarin and Arabic, where non-Latin script required adapting the tests for spelling variations.
**test_case**
The test case text.
**label_gold**
The gold standard label ("hateful" or "non-hateful") of the test case. All test cases within a given functionality have the same gold standard label.
**target_ident**
Where applicable, the protected group that is targeted or referenced in the test case. All HateChecks cover seven target groups, but their composition varies across languages.
**ref_case_id**
For hateful cases, where applicable, the ID of the hateful case which was perturbed to generate this test case. For non-hateful cases, where applicable, the ID of the hateful case which is contrasted by this test case.
**ref_templ_id**
The equivalent to ref_case_id, but for template IDs.
**templ_id**
The ID of the template from which the test case was generated.
**case_templ**
The template from which the test case was generated (where applicable).
**gender_male** and **gender_female**
For gender-inflected languages (French, Spanish, Portuguese, Hindi, Arabic, Italian, Polish, German), only for cases where gender inflection is relevant, separate entries for gender_male and gender_female replace case_templ.
**label_annotated**
A list of labels given by the three annotators who reviewed the test case (e.g., "['hateful', 'hateful', 'hateful']").
**label_annotated_maj**
The majority vote of the three annotators (e.g., "hateful"). In some cases this differs from the gold label given by our language experts.
**disagreement_in_case**
True if label_annotated_maj does not match label_gold for the entry.
**disagreement_in_template**
True if the test case is generated from an IDENT template and there is at least one case with disagreement_in_case generated from the same template. This can be used to exclude entire templates from MHC. |
nancyalarabawy/PlantLeafDiseases_images | ---
dataset_info:
features:
- name: name
dtype: string
- name: uuid
dtype: string
- name: status
dtype: string
- name: label.annotations
list:
- name: id
dtype: int32
- name: category_id
dtype: int32
- name: label.segmentation_bitmap.url
dtype: string
- name: image
dtype: image
splits:
- name: images
num_bytes: 3910841935.0
num_examples: 4000
download_size: 3910468090
dataset_size: 3910841935.0
configs:
- config_name: default
data_files:
- split: images
path: data/images-*
---
|
heliosprime/twitter_dataset_1713230496 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 19708
num_examples: 57
download_size: 18304
dataset_size: 19708
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713230496"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
quocanh34/viet_vlsp | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 24074955754.41959
num_examples: 171441
- name: validation
num_bytes: 1053341643.8704103
num_examples: 7501
download_size: 25080680499
dataset_size: 25128297398.29
---
# Dataset Card for "viet_vlsp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
susnato/test-squad | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 79346108
num_examples: 87599
download_size: 0
dataset_size: 79346108
---
# Dataset Card for "test-squad"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mstz/contraceptive | ---
language:
- en
tags:
- contraceptive
- tabular_classification
- binary_classification
- UCI
pretty_name: Contraceptive evaluation
size_categories:
- 1K<n<10K
task_categories:
- tabular-classification
configs:
- contraceptive
license: cc
---
# Contraceptive
The [Contraceptive dataset](https://archive-beta.ics.uci.edu/dataset/30/contraceptive+method+choice) from the [UCI repository](https://archive-beta.ics.uci.edu).
Does the couple use contraceptives?
# Configurations and tasks
| **Configuration** | **Task** | **Description** |
|-------------------|---------------------------|-------------------------|
| contraceptive | Binary classification | Does the couple use contraceptives?|
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/contraceptive", "contraceptive")["train"]
``` |
killah-t-cell/boxes_test_controlnet_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 212085.0
num_examples: 4
download_size: 196994
dataset_size: 212085.0
---
# Dataset Card for "boxes_test_controlnet_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Liberty-L/race_train | ---
dataset_info:
features:
- name: data_index_by_user
dtype: int64
- name: article
dtype: string
- name: answer
dtype: string
- name: question
dtype: string
- name: options
sequence: string
- name: input_ids
sequence:
sequence: int32
- name: token_type_ids
sequence:
sequence: int8
- name: attention_mask
sequence:
sequence: int8
- name: label
dtype: int64
splits:
- name: train
num_bytes: 170857353.97264022
num_examples: 15716
download_size: 55886476
dataset_size: 170857353.97264022
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_uukuguy__speechless-coder-ds-6.7b | ---
pretty_name: Evaluation run of uukuguy/speechless-coder-ds-6.7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-coder-ds-6.7b](https://huggingface.co/uukuguy/speechless-coder-ds-6.7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-coder-ds-6.7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T07:08:30.796108](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-coder-ds-6.7b/blob/main/results_2023-12-30T07-08-30.796108.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.38073989952019327,\n\
\ \"acc_stderr\": 0.03433559818958823,\n \"acc_norm\": 0.38307431216916843,\n\
\ \"acc_norm_stderr\": 0.0350891686808636,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766373,\n \"mc2\": 0.4167302788975791,\n\
\ \"mc2_stderr\": 0.014552137962691033\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3378839590443686,\n \"acc_stderr\": 0.013822047922283516,\n\
\ \"acc_norm\": 0.36860068259385664,\n \"acc_norm_stderr\": 0.014097810678042185\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.40300736904999,\n \
\ \"acc_stderr\": 0.0048949977367190485,\n \"acc_norm\": 0.5245966938856802,\n\
\ \"acc_norm_stderr\": 0.004983740145218606\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34074074074074073,\n\
\ \"acc_stderr\": 0.04094376269996794,\n \"acc_norm\": 0.34074074074074073,\n\
\ \"acc_norm_stderr\": 0.04094376269996794\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.03782728980865469,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.03782728980865469\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.42641509433962266,\n \"acc_stderr\": 0.030437794342983045,\n\
\ \"acc_norm\": 0.42641509433962266,\n \"acc_norm_stderr\": 0.030437794342983045\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3541666666666667,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.3541666666666667,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3699421965317919,\n\
\ \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.3699421965317919,\n\
\ \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.03196758697835361,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.03196758697835361\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.0236369759961018,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.0236369759961018\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4161290322580645,\n\
\ \"acc_stderr\": 0.028040981380761543,\n \"acc_norm\": 0.4161290322580645,\n\
\ \"acc_norm_stderr\": 0.028040981380761543\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03681050869161549,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03681050869161549\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.42424242424242425,\n \"acc_stderr\": 0.03521224908841583,\n \"\
acc_norm\": 0.42424242424242425,\n \"acc_norm_stderr\": 0.03521224908841583\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.39378238341968913,\n \"acc_stderr\": 0.03526077095548237,\n\
\ \"acc_norm\": 0.39378238341968913,\n \"acc_norm_stderr\": 0.03526077095548237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.35384615384615387,\n \"acc_stderr\": 0.024243783994062164,\n\
\ \"acc_norm\": 0.35384615384615387,\n \"acc_norm_stderr\": 0.024243783994062164\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3277310924369748,\n \"acc_stderr\": 0.030489911417673227,\n\
\ \"acc_norm\": 0.3277310924369748,\n \"acc_norm_stderr\": 0.030489911417673227\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.03734535676787198,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.03734535676787198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3577981651376147,\n \"acc_stderr\": 0.02055206078482782,\n \"\
acc_norm\": 0.3577981651376147,\n \"acc_norm_stderr\": 0.02055206078482782\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4027777777777778,\n \"acc_stderr\": 0.03344887382997867,\n \"\
acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.03344887382997867\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.37254901960784315,\n \"acc_stderr\": 0.03393388584958406,\n \"\
acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.03393388584958406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3459915611814346,\n \"acc_stderr\": 0.03096481058878671,\n \
\ \"acc_norm\": 0.3459915611814346,\n \"acc_norm_stderr\": 0.03096481058878671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.35874439461883406,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.35874439461883406,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.48091603053435117,\n \"acc_stderr\": 0.04382094705550989,\n\
\ \"acc_norm\": 0.48091603053435117,\n \"acc_norm_stderr\": 0.04382094705550989\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4380165289256198,\n \"acc_stderr\": 0.045291468044357915,\n \"\
acc_norm\": 0.4380165289256198,\n \"acc_norm_stderr\": 0.045291468044357915\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3611111111111111,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.3611111111111111,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.44171779141104295,\n \"acc_stderr\": 0.039015918258361836,\n\
\ \"acc_norm\": 0.44171779141104295,\n \"acc_norm_stderr\": 0.039015918258361836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.39805825242718446,\n \"acc_stderr\": 0.04846748253977239,\n\
\ \"acc_norm\": 0.39805825242718446,\n \"acc_norm_stderr\": 0.04846748253977239\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5897435897435898,\n\
\ \"acc_stderr\": 0.03222414045241108,\n \"acc_norm\": 0.5897435897435898,\n\
\ \"acc_norm_stderr\": 0.03222414045241108\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.40485312899106,\n\
\ \"acc_stderr\": 0.017553246467720253,\n \"acc_norm\": 0.40485312899106,\n\
\ \"acc_norm_stderr\": 0.017553246467720253\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3872832369942196,\n \"acc_stderr\": 0.02622615860512465,\n\
\ \"acc_norm\": 0.3872832369942196,\n \"acc_norm_stderr\": 0.02622615860512465\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2871508379888268,\n\
\ \"acc_stderr\": 0.015131608849963729,\n \"acc_norm\": 0.2871508379888268,\n\
\ \"acc_norm_stderr\": 0.015131608849963729\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.02791405551046802,\n\
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02791405551046802\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3729903536977492,\n\
\ \"acc_stderr\": 0.027466610213140105,\n \"acc_norm\": 0.3729903536977492,\n\
\ \"acc_norm_stderr\": 0.027466610213140105\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.32407407407407407,\n \"acc_stderr\": 0.026041766202717167,\n\
\ \"acc_norm\": 0.32407407407407407,\n \"acc_norm_stderr\": 0.026041766202717167\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.29432624113475175,\n \"acc_stderr\": 0.0271871270115038,\n \
\ \"acc_norm\": 0.29432624113475175,\n \"acc_norm_stderr\": 0.0271871270115038\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.30182529335071706,\n\
\ \"acc_stderr\": 0.01172435051810589,\n \"acc_norm\": 0.30182529335071706,\n\
\ \"acc_norm_stderr\": 0.01172435051810589\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.028418208619406787,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.028418208619406787\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3088235294117647,\n \"acc_stderr\": 0.01869085027359528,\n \
\ \"acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.01869085027359528\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4326530612244898,\n \"acc_stderr\": 0.031717528240626645,\n\
\ \"acc_norm\": 0.4326530612244898,\n \"acc_norm_stderr\": 0.031717528240626645\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.36318407960199006,\n\
\ \"acc_stderr\": 0.034005985055990146,\n \"acc_norm\": 0.36318407960199006,\n\
\ \"acc_norm_stderr\": 0.034005985055990146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.36257309941520466,\n \"acc_stderr\": 0.036871306155620606,\n\
\ \"acc_norm\": 0.36257309941520466,\n \"acc_norm_stderr\": 0.036871306155620606\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766373,\n \"mc2\": 0.4167302788975791,\n\
\ \"mc2_stderr\": 0.014552137962691033\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5887924230465666,\n \"acc_stderr\": 0.013829128358676876\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18726307808946172,\n \
\ \"acc_stderr\": 0.010745914199510825\n }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-coder-ds-6.7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|arc:challenge|25_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|gsm8k|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hellaswag|10_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T07-08-30.796108.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T07-08-30.796108.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- '**/details_harness|winogrande|5_2023-12-30T07-08-30.796108.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T07-08-30.796108.parquet'
- config_name: results
data_files:
- split: 2023_12_30T07_08_30.796108
path:
- results_2023-12-30T07-08-30.796108.parquet
- split: latest
path:
- results_2023-12-30T07-08-30.796108.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-coder-ds-6.7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [uukuguy/speechless-coder-ds-6.7b](https://huggingface.co/uukuguy/speechless-coder-ds-6.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-coder-ds-6.7b",
"harness_winogrande_5",
	split="latest")
```
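The configuration names listed in this card follow a regular pattern derived from the harness task identifiers (e.g. `harness|truthfulqa:mc|0` becomes `harness_truthfulqa_mc_0`). A small helper, hypothetical and for illustration only, that reconstructs a config name from a task id and few-shot count:

```python
def config_name(task: str, n_shot: int) -> str:
    """Build the config name used in this card for a harness task.

    Illustrative helper (not part of any library): the config name
    replaces ':' and '-' in the harness task id with underscores and
    appends the few-shot count.
    """
    return "harness_" + task.replace(":", "_").replace("-", "_") + f"_{n_shot}"

print(config_name("winogrande", 5))             # harness_winogrande_5
print(config_name("truthfulqa:mc", 0))          # harness_truthfulqa_mc_0
print(config_name("hendrycksTest-virology", 5)) # harness_hendrycksTest_virology_5
```

Any of these names can be passed as the second argument to `load_dataset` to retrieve the per-task details.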
## Latest results
These are the [latest results from run 2023-12-30T07:08:30.796108](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-coder-ds-6.7b/blob/main/results_2023-12-30T07-08-30.796108.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.38073989952019327,
"acc_stderr": 0.03433559818958823,
"acc_norm": 0.38307431216916843,
"acc_norm_stderr": 0.0350891686808636,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766373,
"mc2": 0.4167302788975791,
"mc2_stderr": 0.014552137962691033
},
"harness|arc:challenge|25": {
"acc": 0.3378839590443686,
"acc_stderr": 0.013822047922283516,
"acc_norm": 0.36860068259385664,
"acc_norm_stderr": 0.014097810678042185
},
"harness|hellaswag|10": {
"acc": 0.40300736904999,
"acc_stderr": 0.0048949977367190485,
"acc_norm": 0.5245966938856802,
"acc_norm_stderr": 0.004983740145218606
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.04094376269996794,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.04094376269996794
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.03782728980865469,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.03782728980865469
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.42641509433962266,
"acc_stderr": 0.030437794342983045,
"acc_norm": 0.42641509433962266,
"acc_norm_stderr": 0.030437794342983045
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3541666666666667,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.3541666666666667,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3699421965317919,
"acc_stderr": 0.036812296333943194,
"acc_norm": 0.3699421965317919,
"acc_norm_stderr": 0.036812296333943194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.03196758697835361,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.03196758697835361
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.0236369759961018,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.0236369759961018
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4161290322580645,
"acc_stderr": 0.028040981380761543,
"acc_norm": 0.4161290322580645,
"acc_norm_stderr": 0.028040981380761543
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694436,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694436
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03681050869161549,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03681050869161549
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.42424242424242425,
"acc_stderr": 0.03521224908841583,
"acc_norm": 0.42424242424242425,
"acc_norm_stderr": 0.03521224908841583
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.39378238341968913,
"acc_stderr": 0.03526077095548237,
"acc_norm": 0.39378238341968913,
"acc_norm_stderr": 0.03526077095548237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.35384615384615387,
"acc_stderr": 0.024243783994062164,
"acc_norm": 0.35384615384615387,
"acc_norm_stderr": 0.024243783994062164
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3277310924369748,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.3277310924369748,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.03734535676787198,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.03734535676787198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3577981651376147,
"acc_stderr": 0.02055206078482782,
"acc_norm": 0.3577981651376147,
"acc_norm_stderr": 0.02055206078482782
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.03344887382997867,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.03344887382997867
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.03393388584958406,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.03393388584958406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3459915611814346,
"acc_stderr": 0.03096481058878671,
"acc_norm": 0.3459915611814346,
"acc_norm_stderr": 0.03096481058878671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.35874439461883406,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.35874439461883406,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48091603053435117,
"acc_stderr": 0.04382094705550989,
"acc_norm": 0.48091603053435117,
"acc_norm_stderr": 0.04382094705550989
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4380165289256198,
"acc_stderr": 0.045291468044357915,
"acc_norm": 0.4380165289256198,
"acc_norm_stderr": 0.045291468044357915
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.44171779141104295,
"acc_stderr": 0.039015918258361836,
"acc_norm": 0.44171779141104295,
"acc_norm_stderr": 0.039015918258361836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.39805825242718446,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.39805825242718446,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.03222414045241108,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.03222414045241108
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.40485312899106,
"acc_stderr": 0.017553246467720253,
"acc_norm": 0.40485312899106,
"acc_norm_stderr": 0.017553246467720253
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.02622615860512465,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.02622615860512465
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2871508379888268,
"acc_stderr": 0.015131608849963729,
"acc_norm": 0.2871508379888268,
"acc_norm_stderr": 0.015131608849963729
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02791405551046802,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02791405551046802
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3729903536977492,
"acc_stderr": 0.027466610213140105,
"acc_norm": 0.3729903536977492,
"acc_norm_stderr": 0.027466610213140105
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.026041766202717167,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.026041766202717167
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.29432624113475175,
"acc_stderr": 0.0271871270115038,
"acc_norm": 0.29432624113475175,
"acc_norm_stderr": 0.0271871270115038
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.30182529335071706,
"acc_stderr": 0.01172435051810589,
"acc_norm": 0.30182529335071706,
"acc_norm_stderr": 0.01172435051810589
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.028418208619406787,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.028418208619406787
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3088235294117647,
"acc_stderr": 0.01869085027359528,
"acc_norm": 0.3088235294117647,
"acc_norm_stderr": 0.01869085027359528
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4326530612244898,
"acc_stderr": 0.031717528240626645,
"acc_norm": 0.4326530612244898,
"acc_norm_stderr": 0.031717528240626645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.36318407960199006,
"acc_stderr": 0.034005985055990146,
"acc_norm": 0.36318407960199006,
"acc_norm_stderr": 0.034005985055990146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.36257309941520466,
"acc_stderr": 0.036871306155620606,
"acc_norm": 0.36257309941520466,
"acc_norm_stderr": 0.036871306155620606
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766373,
"mc2": 0.4167302788975791,
"mc2_stderr": 0.014552137962691033
},
"harness|winogrande|5": {
"acc": 0.5887924230465666,
"acc_stderr": 0.013829128358676876
},
"harness|gsm8k|5": {
"acc": 0.18726307808946172,
"acc_stderr": 0.010745914199510825
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
pasola/filo | ---
license: unknown
---
|
wbxlala/har3 | ---
dataset_info:
features:
- name: image
sequence:
sequence:
sequence: float64
- name: label
dtype: float64
splits:
- name: test
num_bytes: 13644996
num_examples: 1471
- name: train
num_bytes: 54552156
num_examples: 5881
download_size: 70093717
dataset_size: 68197152
---
# Dataset Card for "har3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
amitkedia/Financial-Fraud-Dataset | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
tags:
- finance
size_categories:
- 1K<n<10K
---
# Dataset Card for Financial Fraud Labeled Dataset
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
This dataset collects financial filings submitted by various companies to the U.S. Securities and Exchange Commission (SEC). It consists of 85 companies involved in fraudulent cases and an equal number of companies not involved in fraudulent activities. The `Fillings` column includes information such as the company's MD&A and financial statements for the years the company reported on the SEC website.
This dataset was used for research in detecting financial fraud using multiple LLMs and traditional machine-learning models.
- **Curated by:** [Amit Kedia](https://www.linkedin.com/in/theamitkedia/)
- **Language(s) (NLP):** English
- **License:** Apache 2.0
### Dataset Sources
- **Repository:** [GitHub](https://github.com/amitkedia007/Financial-Fraud-Detection-Using-LLMs)
- **Thesis:** [Financial Fraud Detection using LLMs](https://github.com/amitkedia007/Financial-Fraud-Detection-Using-LLMs/blob/main/Detailed_Report_on_financial_fraud_detection.pdf)
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
#### Code to directly use the dataset:
```python
from datasets import load_dataset

dataset = load_dataset("amitkedia/Financial-Fraud-Dataset")
```
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
There are some limitations of the dataset:
1. This dataset is designed for academic research.
2. The text needs to be cleaned before further processing.
3. The dataset does not cover all fraudulent cases; it is limited to the U.S. Securities and Exchange Commission (SEC), which means both the fraudulent and non-fraudulent cases are US companies.
## Dataset Structure
For the structure of the dataset, see the dataset viewer.
## Dataset Creation
Check out the Thesis
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
To help the financial industry develop better models for detecting fraudulent activities, which can save billions of dollars for governments and banks.
#### Data Collection and Processing
Please refer to the thesis.
## Dataset Card Authors
[Amit Kedia](https://www.linkedin.com/in/theamitkedia/)
|
saier/unarXive_citrec | ---
annotations_creators:
- machine-generated
language:
- en
language_creators:
- found
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
pretty_name: unarXive citation recommendation
size_categories:
- 1M<n<10M
tags:
- arXiv.org
- arXiv
- citation recommendation
- citation
- reference
- publication
- paper
- preprint
- section
- physics
- mathematics
- computer science
- cs
task_categories:
- text-classification
task_ids:
- multi-class-classification
source_datasets:
- extended|10.5281/zenodo.7752615
dataset_info:
features:
- name: _id
dtype: string
- name: text
dtype: string
- name: marker
dtype: string
- name: marker_offsets
sequence:
sequence: int64
- name: label
dtype: string
config_name: .
splits:
- name: train
num_bytes: 5457336094
num_examples: 2043192
- name: test
num_bytes: 551012459
num_examples: 225084
- name: validation
num_bytes: 586422261
num_examples: 225348
download_size: 7005370567
dataset_size: 6594770814
---
# Dataset Card for unarXive citation recommendation
## Dataset Description
* **Homepage:** [https://github.com/IllDepence/unarXive](https://github.com/IllDepence/unarXive)
* **Paper:** [unarXive 2022: All arXiv Publications Pre-Processed for NLP, Including Structured Full-Text and Citation Network](https://arxiv.org/abs/2303.14957)
### Dataset Summary
The unarXive citation recommendation dataset contains 2.5 million paragraphs from computer science papers, each with an annotated citation marker. The paragraphs and citation information are derived from [unarXive](https://github.com/IllDepence/unarXive).
Note that citation information is only given as the [OpenAlex](https://openalex.org/) ID of the cited paper. An important consideration for models is therefore whether the data is used *as is*, or whether additional information about the cited papers (metadata, abstracts, full-text, etc.) is used.
The dataset can be used as follows.
```
from datasets import load_dataset
citrec_data = load_dataset('saier/unarXive_citrec')
citrec_data = citrec_data.class_encode_column('label') # assign target label column
citrec_data = citrec_data.remove_columns('_id') # remove sample ID column
```
## Dataset Structure
### Data Instances
Each data instance contains the paragraph’s text as well as information on one of the contained citation markers, in the form of a label (cited document OpenAlex ID), citation marker, and citation marker offset. An example is shown below.
```
{'_id': '7c1464bb-1f0f-4b38-b1a3-85754eaf6ad1',
'label': 'https://openalex.org/W3115081393',
'marker': '[1]',
'marker_offsets': [[316, 319]],
'text': 'Data: For sentiment analysis on Hindi-English CM tweets, we used the '
'dataset provided by the organizers of Task 9 at SemEval-2020.\n'
'The training dataset consists of 14 thousand tweets.\n'
'Whereas, the validation dataset as well as the test dataset contain '
'3 thousand tweets each.\n'
'The details of the dataset are given in [1]}.\n'
'For this task, we did not use any external dataset.\n'}
```
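The `marker_offsets` field indexes directly into `text`. A minimal sketch of recovering the marker from its offsets, using a shortened, hypothetical paragraph rather than the full instance above:

```python
# Hypothetical shortened sample; in the real data the offsets index into the
# full paragraph text (e.g. [[316, 319]] in the instance above).
sample = {
    "marker": "[1]",
    "marker_offsets": [[40, 43]],
    "text": "The details of the dataset are given in [1].",
}

# Each [start, end) pair slices out exactly the citation marker.
for start, end in sample["marker_offsets"]:
    assert sample["text"][start:end] == sample["marker"]
```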
### Data Splits
The data is split into training, development, and testing data as follows.
* Training: 2,043,192 instances
* Development: 225,348 instances
* Testing: 225,084 instances
## Dataset Creation
### Source Data
The paragraph texts are extracted from the data set [unarXive](https://github.com/IllDepence/unarXive).
#### Who are the source language producers?
The paragraphs were written by the authors of the arXiv papers. In the file `license_info.jsonl`, author and text licensing information can be found for all samples. An example is shown below.
```
{'authors': 'Yusuke Sekikawa, Teppei Suzuki',
'license': 'http://creativecommons.org/licenses/by/4.0/',
'paper_arxiv_id': '2011.09852',
'sample_ids': ['cc375518-347c-43d0-bfb2-f88564d66df8',
'18dc073e-a48e-488e-b34c-e5fc3cb8a4ca',
'0c2e89b3-d863-4bc2-9e11-8f6c48d867cb',
'd85e46cf-b11d-49b6-801b-089aa2dd037d',
'92915cea-17ab-4a98-aad2-417f6cdd53d2',
'e88cb422-47b7-4f69-9b0b-fbddf8140d98',
'4f5094a4-0e6e-46ae-a34d-e15ce0b9803c',
'59003494-096f-4a7c-ad65-342b74eed561',
'6a99b3f5-217e-4d3d-a770-693483ef8670']}
```
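Since licensing varies per paper, a consumer may want to restrict itself to permissively licensed samples. A minimal sketch of collecting sample IDs covered by a Creative Commons BY license from `license_info.jsonl` (the file path and the license filter are assumptions, not part of the official tooling):

```python
import json

def permissive_sample_ids(path="license_info.jsonl"):
    """Collect sample IDs whose paper license is a Creative Commons BY variant."""
    ids = set()
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            record = json.loads(line)
            license_url = record.get("license") or ""
            if "creativecommons.org/licenses/by" in license_url:
                ids.update(record.get("sample_ids", []))
    return ids
```

Samples whose `_id` is not in the returned set can then be filtered out of the loaded dataset.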
### Annotations
Citation information in unarXive is automatically determined ([see implementation](https://github.com/IllDepence/unarXive/blob/master/src/match_references_openalex.py)).
<!--
## Considerations for Using the Data
### Discussion and Biases
TODO
### Other Known Limitations
TODO
-->
## Additional Information
### Licensing information
The dataset is released under the Creative Commons Attribution-ShareAlike 4.0.
### Citation Information
```
@inproceedings{Saier2023unarXive,
author = {Saier, Tarek and Krause, Johan and F\"{a}rber, Michael},
title = {{unarXive 2022: All arXiv Publications Pre-Processed for NLP, Including Structured Full-Text and Citation Network}},
booktitle = {Proceedings of the 23rd ACM/IEEE Joint Conference on Digital Libraries},
year = {2023},
series = {JCDL '23}
}
```
|
siqideng/proposal_drafter_feedback | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jerrysd/mw2 | ---
license: wtfpl
---
|
Back-up/test_ds_v2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: response
struct:
- name: response
dtype: string
- name: answers
struct:
- name: answer_start
sequence: int64
- name: text
sequence: string
- name: instruction
dtype: string
- name: prompt_name
dtype: string
- name: metadata
struct:
- name: max_ratio
dtype: float64
- name: paragraph_similar
dtype: string
- name: start_index
dtype: int64
splits:
- name: train
num_bytes: 21511872
num_examples: 7597
download_size: 8276932
dataset_size: 21511872
---
# Dataset Card for "test_ds_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |