
bertin-project/bertin-base-random-exp-512seqlen

Fill-Mask · Transformers · PyTorch · JAX · TensorBoard · Joblib · Safetensors · Spanish · roberta
Instructions for using bertin-project/bertin-base-random-exp-512seqlen with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.

  • Libraries
  • Transformers

    How to use bertin-project/bertin-base-random-exp-512seqlen with Transformers:

    # Option 1: use a pipeline as a high-level helper
    from transformers import pipeline
    
    pipe = pipeline("fill-mask", model="bertin-project/bertin-base-random-exp-512seqlen")
    
    # Option 2: load the tokenizer and model directly
    from transformers import AutoTokenizer, AutoModelForMaskedLM
    
    tokenizer = AutoTokenizer.from_pretrained("bertin-project/bertin-base-random-exp-512seqlen")
    model = AutoModelForMaskedLM.from_pretrained("bertin-project/bertin-base-random-exp-512seqlen")
  • Notebooks
  • Google Colab
  • Kaggle
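A fill-mask pipeline like the one above returns a ranked list of candidate completions, each a dict with `score`, `token_str`, and `sequence` keys. The sketch below shows how to pick the top candidate from such a result; the `sample_output` is a hypothetical, hand-written example in that format (this is a RoBERTa-style model, so the mask token in a real call would be `<mask>`), not actual model output.

```python
# Hypothetical fill-mask output, in the list-of-dicts format the
# transformers fill-mask pipeline returns for a single masked input.
sample_output = [
    {"score": 0.21, "token_str": " mundo", "sequence": "Hola mundo."},
    {"score": 0.07, "token_str": " amigo", "sequence": "Hola amigo."},
]

def top_prediction(candidates):
    """Return the highest-scoring candidate's token string, whitespace-stripped."""
    best = max(candidates, key=lambda c: c["score"])
    return best["token_str"].strip()

print(top_prediction(sample_output))  # -> mundo
```

In a real session you would obtain `candidates` from the pipeline itself, e.g. `pipe("Hola <mask>.")`, and the scores would come from the model.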
bertin-base-random-exp-512seqlen / outputs / checkpoints — 3.75 GB
  • 4 contributors
History: 3 commits

This model has 1 file scanned as suspicious.

versae
Step... (249001/250000 | Loss: 2.184157609939575, Acc: 0.5907210111618042): 8%|█▉ | 19999/250000 [6:53:56<72:26:45, 1.13s/it]
61bb10e almost 5 years ago
  • checkpoint-245000
    Step... (247001/250000 | Loss: 2.180281400680542, Acc: 0.5911673307418823): 7%|█▋ | 17142/250000 [5:55:00<76:24:14, 1.18s/it] almost 5 years ago
  • checkpoint-246000
    Step... (247001/250000 | Loss: 2.180281400680542, Acc: 0.5911673307418823): 7%|█▋ | 17142/250000 [5:55:00<76:24:14, 1.18s/it] almost 5 years ago
  • checkpoint-247000
    Step... (247001/250000 | Loss: 2.180281400680542, Acc: 0.5911673307418823): 7%|█▋ | 17142/250000 [5:55:00<76:24:14, 1.18s/it] almost 5 years ago
  • checkpoint-248000
    Step... (249001/250000 | Loss: 2.184157609939575, Acc: 0.5907210111618042): 8%|█▉ | 19999/250000 [6:53:56<72:26:45, 1.13s/it] almost 5 years ago
  • checkpoint-249000
    Step... (249001/250000 | Loss: 2.184157609939575, Acc: 0.5907210111618042): 8%|█▉ | 19999/250000 [6:53:56<72:26:45, 1.13s/it] almost 5 years ago