---
language: multilingual
license: apache-2.0
tags:
- bert
- modernbert
- flexbert
- masked-language-modeling
---
# Bert

This model is an intermediate ModernBERT/FlexBERT checkpoint, uploaded automatically during training.
## Training Information

- **Step**: 2001
- **Epoch**: 0
- **Samples Seen**: 1070555
## Metrics

No evaluation metrics were recorded for this checkpoint.
## Model Architecture

This model uses the FlexBERT architecture, which incorporates modern improvements over the original BERT design.
## Usage

```python
from transformers import AutoModel, AutoTokenizer

# ModernBERT support requires a recent version of transformers (v4.48 or later).
model = AutoModel.from_pretrained("QuangDuy/Bert")
tokenizer = AutoTokenizer.from_pretrained("QuangDuy/Bert")
```
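Since this checkpoint was trained with a masked-language-modeling objective, the sketch below illustrates classic BERT-style input masking (15% of positions selected; of those, 80% replaced with `[MASK]`, 10% with a random token, 10% left unchanged). The token ids, vocabulary size, and mask token id here are illustrative assumptions, not values read from this checkpoint's tokenizer.

```python
import random

def mask_tokens(token_ids, vocab_size, mask_token_id, mask_prob=0.15, seed=0):
    """BERT-style masking: select ~mask_prob of positions as prediction
    targets; 80% become [MASK], 10% a random token, 10% stay unchanged."""
    rng = random.Random(seed)
    inputs = list(token_ids)
    labels = [-100] * len(token_ids)  # -100 marks positions ignored by the loss
    for i, tok in enumerate(token_ids):
        if rng.random() < mask_prob:
            labels[i] = tok            # original token becomes the target
            r = rng.random()
            if r < 0.8:
                inputs[i] = mask_token_id
            elif r < 0.9:
                inputs[i] = rng.randrange(vocab_size)
            # else: keep the original token in the input
    return inputs, labels

# Illustrative values: 20 dummy token ids, a 30k vocab, mask id 103.
ids = list(range(1000, 1020))
masked, labels = mask_tokens(ids, vocab_size=30000, mask_token_id=103)
```

Positions whose label is `-100` are excluded from the loss, so the model is only trained to predict the selected tokens.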
## Citation

If you use this model, please cite the ModernBERT paper.