---
language: multilingual
license: apache-2.0
tags:
- bert
- modernbert
- flexbert
- masked-language-modeling
---

# Bert

This is an intermediate ModernBERT/FlexBERT checkpoint, uploaded automatically during training.

## Training Information

- **Step**: 2001
- **Epoch**: 0
- **Samples Seen**: 1070555

## Metrics

No evaluation metrics were recorded for this checkpoint.

## Model Architecture

This model uses the FlexBERT architecture with modern improvements over traditional BERT.

## Usage

```python
from transformers import AutoModel, AutoTokenizer

# Custom FlexBERT architectures may require trust_remote_code=True
# if the model code is not part of the installed transformers release.
model = AutoModel.from_pretrained("QuangDuy/Bert")
tokenizer = AutoTokenizer.from_pretrained("QuangDuy/Bert")
```

## Citation

If you use this model, please cite the ModernBERT paper.