# KoModernBERT

Part of the collection *Fine-Tune ModernBERT for Korean Language Processing* (5 items).
How to use x2bee/KoModernBERT-base-mlm-ecs_v03 with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="x2bee/KoModernBERT-base-mlm-ecs_v03")
```

```python
# Load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("x2bee/KoModernBERT-base-mlm-ecs_v03")
model = AutoModelForMaskedLM.from_pretrained("x2bee/KoModernBERT-base-mlm-ecs_v03")
```

This model is a fine-tuned version of x2bee/KoModernBERT-base-mlm-ecs_v01 on an unspecified dataset. It achieves the following results on the evaluation set:
More information needed
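The `fill-mask` pipeline shown above scores every vocabulary token at the masked position and returns the highest-probability candidates. A minimal sketch of that ranking step, using a toy vocabulary and made-up logits rather than the actual model's output:

```python
import math

def softmax(logits):
    # Numerically stable softmax over raw scores
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def top_k_fill(vocab, logits, k=2):
    # Rank candidate tokens for the masked position by probability,
    # mirroring what the fill-mask pipeline does with the MLM head's logits
    probs = softmax(logits)
    ranked = sorted(zip(vocab, probs), key=lambda pair: pair[1], reverse=True)
    return ranked[:k]

# Toy stand-ins for the model's vocabulary and logits at the [MASK] position
vocab = ["서울", "한국", "김치"]
logits = [2.0, 1.0, 0.5]
print(top_k_fill(vocab, logits))
```

The real pipeline performs the same softmax-and-sort over the full vocabulary and additionally decodes each token id back to text.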
Training results:
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 4.7096 | 0.8753 | 250 | 0.2983 |
| 4.3635 | 1.7527 | 500 | 0.2827 |
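The step and epoch columns in the table imply the epoch length. A quick consistency check, assuming evaluation ran every 250 optimizer steps as the step column suggests:

```python
# Step 250 falls at epoch 0.8753 and step 500 at epoch 1.7527,
# so one full epoch is roughly 250 / 0.8753 steps
steps_per_epoch = 250 / 0.8753
print(round(steps_per_epoch))
```

Both rows give roughly the same figure (about 285 to 286 steps per epoch), so the logged epoch fractions are internally consistent.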