
HPLT3-Edu-scores

Dataset summary

HPLT3-JQL-Education is a model-annotated subset of HPLT3 spanning 36 languages. Our model annotations enable filtering that achieves higher-quality training outcomes without excessively aggressive data reduction. HPLT3-Edu-scores contains the scores assigned by deep learning classifiers trained to identify educational samples using Snowflake's Arctic-embed-m-v2.0 embeddings.
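A minimal sketch of how such score-based filtering can be applied. The 2.0 threshold and the averaging across the three classifier columns are illustrative choices for this example, not the procedure used in the paper; the sample records are taken from the dataset itself.

```python
# Sketch: keep documents whose mean classifier score clears a threshold.
# Averaging the three scores and the 2.0 cutoff are illustrative choices.

SCORE_FIELDS = (
    "score_Gemma_Snowflake",
    "score_Llama_Snowflake",
    "score_Mistral_Snowflake",
)

def mean_score(record: dict) -> float:
    """Average the three classifier scores for one record."""
    return sum(record[f] for f in SCORE_FIELDS) / len(SCORE_FIELDS)

def keep_educational(records: list[dict], threshold: float = 2.0) -> list[dict]:
    """Keep records whose mean score meets the threshold."""
    return [r for r in records if mean_score(r) >= threshold]

records = [
    {"document_ids": "a4f748036fe464fc123991d1d213f210",
     "score_Gemma_Snowflake": 1.2109375,
     "score_Llama_Snowflake": 0.0849609375,
     "score_Mistral_Snowflake": 0.36328125},
    {"document_ids": "d35436d480a28d19a0dea41ddcc0d193",
     "score_Gemma_Snowflake": 3.34375,
     "score_Llama_Snowflake": 2.734375,
     "score_Mistral_Snowflake": 3.046875},
]

kept = keep_educational(records)
print([r["document_ids"] for r in kept])  # only the high-scoring document survives
```

In practice the same predicate can be applied lazily with a streaming `filter` so the full dataset never needs to be materialized at once.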

For all training ablations, we used dense decoder-only models with 2 billion parameters, following the LLaMA architecture. For more details, see our paper https://arxiv.org/abs/2505.22232.

The approach described in the paper is straightforward to extend to other languages, and we may add new languages in a future version of this dataset.

We also separately release the computed general-purpose embedding vectors for the full sets of the original HPLT3 dataset in the respective languages, as they can be useful for other applications beyond quality filtering.

Dataset Structure

Data Fields

Each data entry includes:

  • score_Gemma_Snowflake: Quality score obtained by the Gemma-based Snowflake classifier
  • score_Llama_Snowflake: Quality score obtained by the Llama-based Snowflake classifier
  • score_Mistral_Snowflake: Quality score obtained by the Mistral-based Snowflake classifier
  • document_ids: Original HPLT3 document ID.

Data Instance

{
  "document_ids": "a4f748036fe464fc123991d1d213f210",
  "score_Gemma_Snowflake": 1.2109375,
  "score_Llama_Snowflake": 0.0849609375,
  "score_Mistral_Snowflake": 0.36328125
}
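Because each record carries only an ID and scores, joining the scores back to the original HPLT3 text is a dictionary lookup on document_ids. The sketch below assumes a hypothetical `text` field on the HPLT3 side; the score record is the instance shown above.

```python
# Sketch: attach quality scores to original HPLT3 documents by ID.
# The "text" field on the HPLT3 side is a hypothetical placeholder.

def join_scores(documents: list[dict], scores: list[dict]) -> list[dict]:
    """Merge score records into documents sharing the same document_ids."""
    by_id = {s["document_ids"]: s for s in scores}
    joined = []
    for doc in documents:
        score = by_id.get(doc["document_ids"])
        if score is not None:
            joined.append({**doc, **score})
    return joined

docs = [
    {"document_ids": "a4f748036fe464fc123991d1d213f210", "text": "example text"},
    {"document_ids": "unscored_document_id", "text": "example text"},
]
scores = [
    {"document_ids": "a4f748036fe464fc123991d1d213f210",
     "score_Gemma_Snowflake": 1.2109375,
     "score_Llama_Snowflake": 0.0849609375,
     "score_Mistral_Snowflake": 0.36328125},
]

merged = join_scores(docs, scores)
print(len(merged))  # documents without a score record are dropped
```

Building the ID index once keeps the join linear in the number of documents rather than quadratic.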

Origin of the Dataset

This dataset, derived from HPLT3, includes web content collected from 2012 to 2024. As HPLT3 is sourced from the broader internet, it may contain some personally identifiable information (PII), despite efforts to anonymize email addresses and public IP addresses during processing.

Considerations for Data Usage

For information on social impact, potential biases, and known limitations, please refer to the HPLT3 documentation.

Citation information

If you use this dataset in your research or applications, please use the following citation:

@article{ali2025judging,
    title   = {Judging Quality Across Languages: A Multilingual Approach to Pretraining Data Filtering with Language Models},
    author  = {Mehdi Ali and Manuel Brack and Max Lübbering and Elias Wendt and Abbas Goher Khan and Richard Rutmann and Alex Jude and Maurice Kraus and Alexander Arno Weber and Felix Stollenwerk and David Kaczér and Florian Mai and Lucie Flek and Rafet Sifa and Nicolas Flores-Herr and Joachim Köhler and Patrick Schramowski and Michael Fromm and Kristian Kersting},
    year    = {2025},
    journal = {arXiv preprint arXiv:2505.22232}
}