nielsr (HF Staff) committed
Commit 24935bc · verified · 1 Parent(s): ddbe94f

Add link to code repository


This PR improves the model card by adding a direct link to the associated GitHub repository (https://github.com/Nicolas-BZRD/EuroBERT), which likely hosts the code and project artifacts for this research, built with the Optimus training library. This makes it easier for users to find the codebase.

Files changed (1)

1. README.md +2 -1
README.md CHANGED

@@ -1,7 +1,7 @@
 ---
-pipeline_tag: feature-extraction
 library_name: transformers
 license: apache-2.0
+pipeline_tag: feature-extraction
 ---
 
 # Overview
@@ -9,6 +9,7 @@ license: apache-2.0
 This repository contains an encoder model, part of the research presented in the paper *Should We Still Pretrain Encoders with Masked Language Modeling?* (Gisserot-Boukhlef et al.).
 
 * **Paper:** [Should We Still Pretrain Encoders with Masked Language Modeling?](https://huggingface.co/papers/2507.00994)
+* **Code:** [https://github.com/Nicolas-BZRD/EuroBERT](https://github.com/Nicolas-BZRD/EuroBERT)
 * **Blog post:** [Link](https://huggingface.co/blog/Nicolas-BZRD/encoders-should-not-be-only-pre-trained-with-mlm)
 * **Project page:** [https://hf.co/MLMvsCLM](https://hf.co/MLMvsCLM)
 
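The `pipeline_tag: feature-extraction` front matter above declares how this encoder is meant to be used with the `transformers` library: it outputs per-token hidden states, which users typically average over the non-padding positions (mean pooling) to get one fixed-size embedding per input. A minimal NumPy sketch of that pooling step (the mean-pooling convention is an illustrative assumption, not something this model card specifies):

```python
import numpy as np

def mean_pool(hidden_states: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings over real (non-padding) positions.

    hidden_states:  (batch, seq_len, dim) encoder outputs
    attention_mask: (batch, seq_len), 1 for real tokens, 0 for padding
    """
    mask = attention_mask[..., None].astype(hidden_states.dtype)  # (batch, seq_len, 1)
    summed = (hidden_states * mask).sum(axis=1)                   # (batch, dim)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)                # avoid divide-by-zero
    return summed / counts

# Toy example: batch of 1, two tokens (the second is padding), dim 3.
h = np.array([[[1.0, 2.0, 3.0],
               [9.0, 9.0, 9.0]]])
m = np.array([[1, 0]])
print(mean_pool(h, m))  # padding token is ignored -> [[1. 2. 3.]]
```

In practice the hidden states would come from running the model's tokenizer and encoder, then passing `last_hidden_state` and `attention_mask` into a pooling function like this one.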