---
library_name: transformers
license: apache-2.0
base_model: answerdotai/ModernBERT-base
tags:
- generated_from_trainer
model-index:
- name: eternis_router_encoder_sft_10Sep
  results: []
---

# eternis_router_encoder_sft_10Sep

This model is a fine-tuned version of [answerdotai/ModernBERT-base](https://huggingface.co/answerdotai/ModernBERT-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6852
- Complexity Accuracy: 0.772
- Model Accuracy: 0.747
- Overall Accuracy: 0.5793

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: adamw_torch_fused with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.01
- num_epochs: 10

An illustrative `TrainingArguments` sketch reproducing these settings appears at the end of this card.

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Complexity Accuracy | Model Accuracy | Overall Accuracy |
|:-------------:|:------:|:----:|:---------------:|:-------------------:|:--------------:|:----------------:|
| 0.8284        | 0.6857 | 300  | 0.7391          | 0.7275              | 0.7475         | 0.5437           |
| 0.7657        | 1.3703 | 600  | 0.7173          | 0.7408              | 0.7478         | 0.5515           |
| 0.7398        | 2.0549 | 900  | 0.7099          | 0.7502              | 0.748          | 0.5595           |
| 0.7161        | 2.7406 | 1200 | 0.7037          | 0.7578              | 0.748          | 0.5645           |
| 0.7057        | 3.4251 | 1500 | 0.6973          | 0.7635              | 0.7468         | 0.569            |
| 0.7115        | 4.1097 | 1800 | 0.6927          | 0.764               | 0.748          | 0.5705           |
| 0.7214        | 4.7954 | 2100 | 0.6896          | 0.7672              | 0.7482         | 0.5755           |
| 0.7034        | 5.48   | 2400 | 0.6886          | 0.769               | 0.7472         | 0.5777           |
| 0.6935        | 6.1646 | 2700 | 0.6878          | 0.769               | 0.7478         | 0.577            |
| 0.7055        | 6.8503 | 3000 | 0.6867          | 0.7722              | 0.7465         | 0.5787           |
| 0.6983        | 7.5349 | 3300 | 0.6858          | 0.7728              | 0.7465         | 0.5797           |
| 0.7092        | 8.2194 | 3600 | 0.6849          | 0.774               | 0.747          | 0.5803           |
| 0.697         | 8.9051 | 3900 | 0.6851          | 0.7718              | 0.747          | 0.5787           |
| 0.6989        | 9.5897 | 4200 | 0.6852          | 0.772               | 0.747          | 0.5793           |

### Framework versions

- Transformers 4.56.1
- Pytorch 2.8.0+cu128
- Datasets 4.0.0
- Tokenizers 0.22.0
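
The card does not yet document how to load or call the model. As a minimal sketch only: assuming the checkpoint is available under this repository name (or a local path) and exposes a standard sequence-classification head, loading could look like the snippet below. The separate complexity/model/overall accuracy metrics above suggest the router may use a custom multi-head architecture, in which case the actual loading code will differ; the repo id and example input here are placeholders.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder id: replace with the full Hub repo id or a local checkpoint path.
model_id = "eternis_router_encoder_sft_10Sep"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Encode a query and take the argmax over the classification logits.
inputs = tokenizer("Example query to route", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted_class = logits.argmax(dim=-1).item()
print(predicted_class)
```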
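
The hyperparameters listed under "Training hyperparameters" map directly onto `transformers.TrainingArguments`. The sketch below is a minimal reconstruction of that configuration, not the original training script; the output directory is a placeholder, and the dataset, model head, and evaluation setup are not documented in this card.

```python
from transformers import TrainingArguments

# Minimal sketch of the configuration listed under "Training hyperparameters".
# Only the optimizer/scheduler/batch settings come from the card; output_dir
# is a placeholder and the dataset/model setup is not shown.
training_args = TrainingArguments(
    output_dir="eternis_router_encoder_sft_10Sep",  # placeholder path
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=2,  # 32 * 2 = total train batch size of 64
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.01,
    num_train_epochs=10,
)
```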