Add comprehensive model card

#1 opened by nielsr (HF Staff)

This PR adds a comprehensive model card for the model in this repository.
It includes:

  • Metadata: `pipeline_tag: feature-extraction`, `library_name: transformers`, and `license: apache-2.0`, along with descriptive tags. These fields improve discoverability and integration within the Hugging Face Hub.
  • Links: to the associated paper (Should We Still Pretrain Encoders with Masked Language Modeling?), the project page (https://hf.co/MLMvsCLM), and the GitHub repository for the Optimus training library (https://github.com/Nicolas-BZRD/EuroBERT).
  • Model description: a summary of the paper's key findings on MLM, CLM, and biphasic pretraining strategies for encoders.
  • Sample usage: a clear Python snippet demonstrating feature extraction with the `transformers` library, including the required `trust_remote_code=True` parameter.
  • Citation: the BibTeX entry for the paper.
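The feature-extraction snippet described above might look roughly like the sketch below. The model id `your-org/your-encoder` and the `extract_features`/`mean_pool` helpers are placeholders for illustration, not names taken from the PR; the only detail confirmed by the PR is that loading uses `transformers` with `trust_remote_code=True`.

```python
import torch
from transformers import AutoModel, AutoTokenizer


def mean_pool(hidden_states: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    """Average token embeddings, ignoring padding positions."""
    mask = attention_mask.unsqueeze(-1).type_as(hidden_states)
    return (hidden_states * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)


def extract_features(texts, model_id="your-org/your-encoder"):
    # model_id is a placeholder; substitute the repository's actual model id.
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModel.from_pretrained(model_id, trust_remote_code=True)
    inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # Pool the last hidden states into one fixed-size vector per input text.
    return mean_pool(outputs.last_hidden_state, inputs["attention_mask"])
```

Called as `extract_features(["hello world"])`, this would return a tensor of shape `(1, hidden_size)` suitable for downstream similarity or retrieval tasks.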
Cannot merge
This branch has merge conflicts in the following files:
  • README.md
