Fineweb Edu ModernBERT Classifiers
How to use mrm8488/ModernBERT-base-ft-fineweb-edu-annotations with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="mrm8488/ModernBERT-base-ft-fineweb-edu-annotations")
```

```python
# Or load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("mrm8488/ModernBERT-base-ft-fineweb-edu-annotations")
model = AutoModelForSequenceClassification.from_pretrained("mrm8488/ModernBERT-base-ft-fineweb-edu-annotations")
```

This model is a fine-tuned version of answerdotai/ModernBERT-base. It achieves the following results on the evaluation set:
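When loading the model directly, the classifier returns raw logits that must be converted to a predicted class. A minimal sketch of that step, assuming the hypothetical logits below stand in for `model(**tokenizer(text, return_tensors="pt")).logits` and that the head has six classes (Fineweb-Edu annotations score documents 0–5; check `model.config.id2label` for the actual label set):

```python
import math

# Hypothetical logits for six educational-score classes (0-5);
# in practice these come from the model's forward pass.
logits = [-1.2, 0.3, 2.1, 0.8, -0.5, -2.0]

# Softmax turns logits into a probability per class.
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# The predicted class is the index with the highest probability.
predicted_class = max(range(len(probs)), key=probs.__getitem__)
print(predicted_class)
```

The pipeline helper performs this conversion internally and returns the label name and score directly.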
The following results were obtained during training (metrics computed on the validation set):
| Training Loss | Epoch | Step | Validation Loss | F1 Score | Precision Score | Recall Score |
|---|---|---|---|---|---|---|
| 0.6283 | 1.0 | 11686 | 0.5695 | 0.7615 | 0.7666 | 0.7587 |
| 0.4154 | 2.0 | 23372 | 0.5917 | 0.7749 | 0.7840 | 0.7705 |
| 0.1468 | 3.0 | 35058 | 1.1047 | 0.7565 | 0.7603 | 0.7545 |
Base model: answerdotai/ModernBERT-base