=======================================
Model Training Summary
=======================================
Model: prajjwal1/bert-medium
Task: sst2
Training Date: Sat Sep 6 02:20:18 PM JST 2025
Hyperparameters:
- Epochs: 10
- Batch Size: 32
- Learning Rate: 3e-5
- Warmup Steps: 400
- Weight Decay: 0.01
- Max Sequence Length: 128
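For reference, the hyperparameters above can be sketched as a Python configuration dict. The keyword names below follow the HuggingFace `transformers.TrainingArguments` convention, which is an assumption; the actual interface of train.py is not shown in this summary.

```python
# Hypothetical sketch: hyperparameters from this summary expressed as
# TrainingArguments-style keyword arguments (names assumed, values
# taken verbatim from the log above).
training_kwargs = {
    "num_train_epochs": 10,
    "per_device_train_batch_size": 32,
    "learning_rate": 3e-5,
    "warmup_steps": 400,
    "weight_decay": 0.01,
    "output_dir": "/home/ubuntu/master-research/not-trust-quivalent/models",
}

# Max sequence length (128) is typically applied at tokenization time,
# not via TrainingArguments, so it is kept separate here.
max_seq_length = 128

print(training_kwargs["learning_rate"], max_seq_length)
```

This is only a reconstruction of the recorded settings, not the script's actual argument names.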
Training Status: SUCCESS
Job: 8/15
Model Directory: /home/ubuntu/master-research/not-trust-quivalent/models/sst2/prajjwal1-bert-medium
Script Path: /home/ubuntu/master-research/not-trust-quivalent/src/finetune/train.py
Output Directory: /home/ubuntu/master-research/not-trust-quivalent/models
=======================================