answerdotai/ModernBERT-large

Tags: Fill-Mask · Transformers · PyTorch · ONNX · Safetensors · English · modernbert · masked-lm · long-context

Instructions for using answerdotai/ModernBERT-large with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.

  • Libraries
  • Transformers

    How to use answerdotai/ModernBERT-large with Transformers:

    # Use a pipeline as a high-level helper
    from transformers import pipeline

    pipe = pipeline("fill-mask", model="answerdotai/ModernBERT-large")
    pipe("Paris is the [MASK] of France.")

    # Or load the tokenizer and model directly
    from transformers import AutoTokenizer, AutoModelForMaskedLM

    tokenizer = AutoTokenizer.from_pretrained("answerdotai/ModernBERT-large")
    model = AutoModelForMaskedLM.from_pretrained("answerdotai/ModernBERT-large")
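The same fill-mask prediction can be done without the pipeline by reading the raw logits at the `[MASK]` position. A minimal sketch, assuming a recent Transformers release with ModernBERT support and PyTorch installed; the example sentence is illustrative:

```python
# Predict the most likely token for a [MASK] position from the raw logits.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "answerdotai/ModernBERT-large"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

text = "The capital of France is [MASK]."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the highest-scoring vocabulary token.
mask_index = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_index].argmax(dim=-1)
predicted_token = tokenizer.decode(predicted_id)
print(predicted_token)
```

For a city-name prompt like this, the decoded token is expected to be a plausible completion such as "Paris", though the exact output depends on the checkpoint.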
  • Notebooks
  • Google Colab
  • Kaggle

Community discussions:

  • #14: 'save_total_limit' not respected (opened 11 months ago by enricoburi)
  • #13: update-onnx-model (opened 11 months ago by kozistr)
  • #11: MTEB results? (👍 1; opened over 1 year ago by antonkulaga)
  • #5: Why add_prefix_space=false? (👍 4; opened over 1 year ago by hankcs)
  • #4: Fine-tuning ModernBERT on a Large Dataset with Masked Language Modelling (👍 5; 2 replies; opened over 1 year ago by ssmits)
  • #3: Error in subprocess: concurrent.futures.process.BrokenProcessPool: A process in the process pool was terminated abruptly while the future was running or pending. (opened over 1 year ago by BwandoWando)
  • #2: plan on multilingual variant? (➕ 20; 3 replies; opened over 1 year ago by ahxxm)