|
|
--- |
|
|
library_name: transformers |
|
|
language: en |
|
|
license: apache-2.0 |
|
|
datasets: [] |
|
|
tags: [] |
|
|
--- |
|
|
|
|
|
# Model Card for <Model> |
|
|
|
|
|
A BERT-base model pretrained on <Dataset>.
|
|
|
|
|
## Model Details |
|
|
|
|
|
### Model Description |
|
|
|
|
|
A BERT-base model pretrained on <Dataset> with the masked language modeling (MLM) objective only; a minimal loading sketch follows the details below.
|
|
|
|
|
- **Developed by:** [Cesar Gonzalez-Gutierrez](https://ceguel.es) |
|
|
- **Funded by:** [ERC](https://erc.europa.eu) |
|
|
- **Model type:** MLM-pretrained BERT-base
|
|
- **Language(s) (NLP):** English |
|
|
- **License:** Apache License 2.0
|
|
- **Pretrained from model:** [BERT base model (uncased)](https://huggingface.co/google-bert/bert-base-uncased) |
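
The model should load with the standard `transformers` Auto classes, as in the minimal sketch below; `<org>/<model>` is a placeholder for this repository's Hub id, not a confirmed path.

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer

# "<org>/<model>" is a placeholder for this repository's Hub id.
model_id = "<org>/<model>"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)
```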
|
|
|
|
|
### Model Checkpoints |
|
|
|
|
|
[More Information Needed] |
|
|
|
|
|
### Model Sources |
|
|
|
|
|
- **Paper:** [More Information Needed] |
|
|
|
|
|
## Uses |
|
|
|
|
|
See <https://huggingface.co/google-bert/bert-base-uncased#intended-uses--limitations>. |
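
As with the base model, the most direct use is masked-token prediction. A minimal sketch with the `fill-mask` pipeline, assuming the placeholder Hub id used above:

```python
from transformers import pipeline

# "<org>/<model>" is a placeholder for this repository's Hub id.
unmasker = pipeline("fill-mask", model="<org>/<model>")
unmasker("The capital of France is [MASK].")
```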
|
|
|
|
|
### Checkpoint Use |
|
|
|
|
|
[More Information Needed] |
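
If the checkpoints above are published as branches of this repository, one way to load a specific checkpoint is the `revision` argument of `from_pretrained`; both ids below are hypothetical placeholders:

```python
from transformers import AutoModelForMaskedLM

# Both the repository id and the branch name are placeholders; see
# "Model Checkpoints" above for the actual published revisions.
model = AutoModelForMaskedLM.from_pretrained(
    "<org>/<model>",
    revision="<checkpoint-branch>",
)
```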
|
|
|
|
|
## Bias, Risks, and Limitations |
|
|
|
|
|
See <https://huggingface.co/google-bert/bert-base-uncased#limitations-and-bias>. |
|
|
|
|
|
## Training Details |
|
|
|
|
|
See <https://huggingface.co/google-bert/bert-base-uncased#training-procedure>. |
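
For orientation, the MLM objective can be set up in `transformers` with the standard masked-token collator; the 15% masking rate is BERT's documented default, and the rest of this sketch is an assumption rather than the exact pipeline used here:

```python
from transformers import AutoTokenizer, DataCollatorForLanguageModeling

# "<org>/<model>" is a placeholder for this repository's Hub id.
tokenizer = AutoTokenizer.from_pretrained("<org>/<model>")

# BERT's documented MLM setup masks 15% of input tokens at random.
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer,
    mlm=True,
    mlm_probability=0.15,
)
```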
|
|
|
|
|
### Training Data |
|
|
|
|
|
[More Information Needed] |
|
|
|
|
|
#### Preprocessing
|
|
|
|
|
[More Information Needed] |
|
|
|
|
|
#### Training Hyperparameters |
|
|
|
|
|
- **Training regime:** fp16 mixed precision
|
|
- **Batch size:** 32 |
|
|
- **Gradient accumulation steps:** 3 (effective batch size 32 × 3 = 96; see the sketch below)
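
Expressed as `transformers.TrainingArguments`, a minimal sketch of the settings listed above; the output directory and all unlisted values are assumptions, not the exact recipe:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mlm-pretraining",     # hypothetical path, not the real one
    fp16=True,                        # fp16 mixed-precision regime
    per_device_train_batch_size=32,   # batch size 32
    gradient_accumulation_steps=3,    # effective batch size 32 * 3 = 96
)
```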
|
|
|
|
|
## Environmental Impact |
|
|
|
|
|
- **Hardware Type:** NVIDIA Tesla V100 PCIe 32 GB
|
|
- **Hours used:** [More Information Needed] |
|
|
- **Cluster Provider:** [Artemisa](https://artemisa.ific.uv.es/web/) |
|
|
- **Compute Region:** EU |
|
|
- **Carbon Emitted:** [More Information Needed] (can be estimated with the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700))
|
|
|
|
|
## Citation |
|
|
|
|
|
**BibTeX:** |
|
|
|
|
|
[More Information Needed] |
|
|
|