Add comprehensive model card
#1 by nielsr (HF Staff) · opened
This PR adds a comprehensive model card for the model in this repository.
It includes:
- Metadata: `pipeline_tag: feature-extraction`, `library_name: transformers`, and `license: apache-2.0`, along with descriptive tags. This ensures better discoverability and integration within the Hugging Face Hub.
- Links: To the associated paper (*Should We Still Pretrain Encoders with Masked Language Modeling?*), the project page (https://hf.co/MLMvsCLM), and the GitHub repository for the `Optimus` training library (https://github.com/Nicolas-BZRD/EuroBERT).
- Model Description: A summary of the paper's key findings regarding MLM, CLM, and biphasic pretraining strategies for encoders.
- Sample Usage: A clear Python code snippet demonstrating how to use the model with the `transformers` library for feature extraction, including the necessary `trust_remote_code=True` parameter.
- Citation: The BibTeX entry for the paper.
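For reference, the sample-usage snippet in the card follows the usual `transformers` feature-extraction pattern, along these lines (the model id below is a placeholder, not the repository's actual id):

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Placeholder id for illustration; the model card uses this repository's own id.
model_id = "hf-internal-testing/tiny-random-bert"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# trust_remote_code=True is required when the repo ships custom modeling code.
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer(
    "Should we still pretrain encoders with masked language modeling?",
    return_tensors="pt",
)
with torch.no_grad():
    outputs = model(**inputs)

# Token-level features: shape (batch_size, sequence_length, hidden_size)
embeddings = outputs.last_hidden_state
print(embeddings.shape)
```

The last hidden state can then be pooled (e.g. mean over tokens) to obtain a single sentence embedding.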