Add comprehensive model card for SLModel (MLMvsCLM paper)
#1 opened by nielsr (HF Staff)
This PR adds a comprehensive model card for the SLModel artifact, which is associated with the paper "Should We Still Pretrain Encoders with Masked Language Modeling?".
The model card includes:
- The full abstract of the paper.
- Links to the paper, the associated project page (https://hf.co/MLMvsCLM), and the GitHub repository (https://github.com/Nicolas-BZRD/EuroBERT) which hosts the training library.
- Appropriate metadata: `pipeline_tag: feature-extraction`, `library_name: transformers`, `license: apache-2.0`, and relevant `tags` for better discoverability.
- A clear sample usage code snippet for feature extraction using the `transformers` library.
- A BibTeX citation for the paper.
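The exact snippet added to the model card is not reproduced in this PR description; the sketch below shows what a typical `transformers` feature-extraction usage looks like. The repo id `"MLMvsCLM/SLModel"` is a placeholder assumption, not a confirmed checkpoint name, and the mean-pooling step is one common choice for turning token embeddings into a sentence vector, not necessarily the one used in the card.

```python
import torch


def mean_pool(hidden_states: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    """Average token embeddings over the sequence, ignoring padding positions."""
    mask = attention_mask.unsqueeze(-1).to(hidden_states.dtype)  # (batch, seq, 1)
    summed = (hidden_states * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1e-9)  # avoid division by zero
    return summed / counts


def extract_features(repo_id: str, texts: list[str]) -> torch.Tensor:
    """Extract sentence embeddings with a Hub encoder (requires `transformers`)."""
    from transformers import AutoModel, AutoTokenizer

    # NOTE: repo_id is hypothetical here -- substitute the actual SLModel
    # checkpoint name from the project page (https://hf.co/MLMvsCLM).
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModel.from_pretrained(repo_id)

    inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    return mean_pool(outputs.last_hidden_state, inputs["attention_mask"])


# Example call (downloads the model, so it is commented out here):
# embeddings = extract_features("MLMvsCLM/SLModel", ["Hello world"])
```

The pooling helper is kept separate so it can be reused with any encoder that returns `last_hidden_state`.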
This update improves the documentation and discoverability of the model on the Hugging Face Hub.