---
library_name: transformers
language: en
license: apache-2.0
datasets: []
tags: []
---

# Model Card for A pretrained BERT using .

## Model Details

### Model Description

An MLM-only pretrained BERT-base using .

- **Developed by:** [Cesar Gonzalez-Gutierrez](https://ceguel.es)
- **Funded by:** [ERC](https://erc.europa.eu)
- **Model type:** MLM-pretrained BERT
- **Language(s) (NLP):** English
- **License:** Apache License 2.0
- **Pretrained from model:** [BERT base model (uncased)](https://huggingface.co/google-bert/bert-base-uncased)

### Model Checkpoints

[More Information Needed]

### Model Sources

- **Paper:** [More Information Needed]

## Uses

See .

### Checkpoint Use

[More Information Needed]

A minimal loading sketch appears in the How to Get Started section at the end of this card.

## Bias, Risks, and Limitations

See .

## Training Details

See .

### Training Data

[More Information Needed]

#### Preprocessing

[More Information Needed]

#### Training Hyperparameters

- **Training regime:** fp16 mixed precision
- **Batch size:** 32
- **Gradient accumulation steps:** 3

These settings are sketched as a `TrainingArguments` configuration at the end of this card.

## Environmental Impact

- **Hardware Type:** NVIDIA Tesla V100 PCIe 32 GB
- **Hours used:** [More Information Needed]
- **Cluster Provider:** [Artemisa](https://artemisa.ific.uv.es/web/)
- **Compute Region:** EU
- **Carbon Emitted:** [More Information Needed]

## Citation

**BibTeX:**

[More Information Needed]
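
## How to Get Started with the Model

A minimal usage sketch for the Uses and Checkpoint Use sections above. The repository id `ceguel/bert-base-mlm` and the `revision` value are hypothetical placeholders, not published identifiers; substitute the actual model id and checkpoint revision once available.

```python
from transformers import pipeline

# Load the pretrained MLM head for masked-token prediction.
fill_mask = pipeline(
    "fill-mask",
    model="ceguel/bert-base-mlm",  # hypothetical repository id
    revision="main",               # replace with a specific checkpoint revision
)

# BERT-base (uncased) uses the [MASK] token.
print(fill_mask("The capital of France is [MASK]."))
```

Since this model is MLM-only (no fine-tuned task head), masked-token prediction is its main direct use; downstream tasks require fine-tuning.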
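
For reference, a sketch of how the values in the Training Hyperparameters section map onto a `transformers` `TrainingArguments` configuration. Only `fp16`, the batch size, and the gradient accumulation steps come from this card; every other value (output directory, learning rate, schedule, and so on) is an unstated assumption, not the configuration actually used.

```python
from transformers import TrainingArguments

# Sketch of the documented training regime; values not listed on this
# card (e.g. output_dir) are placeholders.
training_args = TrainingArguments(
    output_dir="bert-mlm-pretraining",  # hypothetical
    fp16=True,                          # fp16 mixed-precision regime
    per_device_train_batch_size=32,     # batch size: 32
    gradient_accumulation_steps=3,      # effective batch: 32 * 3 = 96 per device
)
```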