---
library_name: transformers
language: en
license: apache-2.0
datasets: []
tags: []
---
# Model Card

A pretrained BERT model.
## Model Details
### Model Description
An MLM-only pretrained BERT-base model.
- Developed by: Cesar Gonzalez-Gutierrez
- Funded by: ERC
- Model type: MLM pretrained BERT
- Language(s) (NLP): English
- License: Apache License 2.0
- Pretrained from model: BERT base model (uncased)
### Model Checkpoints
[More Information Needed]
### Model Sources
- Paper: [More Information Needed]
## Uses
See https://huggingface.co/google-bert/bert-base-uncased#intended-uses--limitations.
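As with the base model, this model is primarily intended for masked-token prediction and for fine-tuning on downstream tasks. A minimal sketch using the `transformers` fill-mask pipeline; the repository id below is a placeholder assumption, not this model's actual id:

```python
from transformers import pipeline

# Placeholder repository id; replace with this model's actual id.
unmasker = pipeline("fill-mask", model="path/to/this-model")

# The model predicts the token behind [MASK] (uncased BERT vocabulary).
print(unmasker("The capital of France is [MASK]."))
```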
### Checkpoint Use
[More Information Needed]
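If intermediate pretraining checkpoints are published as revisions (branches) of this repository, they could be selected with the `revision` argument of `from_pretrained`. A minimal sketch; both the repository id and the branch name `step-100000` are purely illustrative assumptions:

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer

# "step-100000" is an illustrative revision name, not a confirmed checkpoint.
tokenizer = AutoTokenizer.from_pretrained("path/to/this-model")
model = AutoModelForMaskedLM.from_pretrained(
    "path/to/this-model", revision="step-100000"
)
```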
## Bias, Risks, and Limitations
See https://huggingface.co/google-bert/bert-base-uncased#limitations-and-bias.
## Training Details
See https://huggingface.co/google-bert/bert-base-uncased#training-procedure.
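The linked procedure is standard masked language modeling. As a hedged sketch of that objective (assuming the default 15% masking rate of the original BERT setup), the `transformers` MLM data collator can produce masked training batches:

```python
from transformers import AutoTokenizer, DataCollatorForLanguageModeling

tokenizer = AutoTokenizer.from_pretrained("google-bert/bert-base-uncased")

# Randomly masks 15% of input tokens, the default setting used by BERT's
# masked language modeling objective.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)
```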
### Training Data
[More Information Needed]
### Preprocessing
[More Information Needed]
### Training Hyperparameters
- Training regime: fp16 mixed precision
- Batch size: 32
- Gradient accumulation steps: 3 (effective batch size: 96)
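A minimal sketch of how these hyperparameters map onto `transformers.TrainingArguments`; only fp16, batch size, and gradient accumulation come from this card, while the remaining values are illustrative placeholders:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-mlm-pretraining",  # assumed, not a reported value
    fp16=True,                          # training regime: fp16 mixed precision
    per_device_train_batch_size=32,     # batch size: 32
    gradient_accumulation_steps=3,      # effective batch size: 96
)
```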
## Environmental Impact
- Hardware Type: NVIDIA Tesla V100 PCIe 32 GB
- Hours used: [More Information Needed]
- Cluster Provider: Artemisa
- Compute Region: EU
- Carbon Emitted: [More Information Needed]
## Citation
BibTeX:
[More Information Needed]