drexalt/NeoBERT-RetroMAE-pretrain

Fill-Mask · Transformers · PyTorch · English · neobert · custom_code
Instructions for using drexalt/NeoBERT-RetroMAE-pretrain with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.

  • Libraries: Transformers

    How to use drexalt/NeoBERT-RetroMAE-pretrain with Transformers:

    # Use a pipeline as a high-level helper
    from transformers import pipeline

    pipe = pipeline("fill-mask", model="drexalt/NeoBERT-RetroMAE-pretrain", trust_remote_code=True)

    # Load the tokenizer and model directly
    from transformers import AutoModelForMaskedLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("drexalt/NeoBERT-RetroMAE-pretrain", trust_remote_code=True)
    model = AutoModelForMaskedLM.from_pretrained("drexalt/NeoBERT-RetroMAE-pretrain", trust_remote_code=True, dtype="auto")
  • Notebooks: Google Colab, Kaggle
NeoBERT-RetroMAE-pretrain (887 MB)
  • 1 contributor
History: 6 commits
Latest commit: Update README.md by drexalt (6188f69, verified, 8 months ago)
  • .gitattributes
    1.52 kB
    initial commit 8 months ago
  • README.md
    2.69 kB
    Update README.md 8 months ago
  • config.json
    1.55 kB
    Upload NeoBERTLMHead 8 months ago
  • model.py
    15.8 kB
    Upload NeoBERTLMHead 8 months ago
  • pytorch_model.bin
    887 MB

    Detected Pickle imports (3):
    • "collections.OrderedDict"
    • "torch._utils._rebuild_tensor_v2"
    • "torch.FloatStorage"

    Upload NeoBERTLMHead 8 months ago
  • rotary.py
    2.58 kB
    Upload NeoBERTLMHead 8 months ago