# foxy-nlp-br
This model is a fine-tuned version of [neuralmind/bert-large-portuguese-cased](https://huggingface.co/neuralmind/bert-large-portuguese-cased) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.1638
- Accuracy: 0.9714
- F1 Weighted: 0.9714
- F1 Macro: 0.9740
- F1 Saudacao: 1.0
- F1 Cancelamento: 1.0
- F1 Reclamacao: 0.8824
- F1 Financeiro: 1.0
- F1 Suporte Tecnico: 0.9545
- F1 Elogio: 0.9412
- F1 Informacao: 1.0
- F1 Pedido Entrega: 1.0
- F1 Conta Perfil: 0.9615
- F1 Negociacao Retencao: 1.0
- F1 Min Class: 0.8824
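As a sanity check, the headline macro and min-class figures follow directly from the per-class F1 scores listed above; a minimal sketch (dictionary keys are abbreviated class names, an illustrative choice):

```python
# Per-class F1 scores from the evaluation results above.
f1_per_class = {
    "saudacao": 1.0,
    "cancelamento": 1.0,
    "reclamacao": 0.8824,
    "financeiro": 1.0,
    "suporte_tecnico": 0.9545,
    "elogio": 0.9412,
    "informacao": 1.0,
    "pedido_entrega": 1.0,
    "conta_perfil": 0.9615,
    "negociacao_retencao": 1.0,
}

# Macro F1 is the unweighted mean over classes; "F1 Min Class" is the
# worst-performing class, a useful floor metric for imbalanced data.
f1_macro = round(sum(f1_per_class.values()) / len(f1_per_class), 4)
f1_min = min(f1_per_class.values())
print(f1_macro, f1_min)  # 0.974 0.8824
```

The min-class metric here flags `reclamacao` as the weakest label, matching the per-class list.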
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4.197342969974621e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 0.0022118889184775006 (a fractional value, likely a warmup ratio produced by hyperparameter search rather than an integer step count)
- num_epochs: 22
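The hyperparameters above can be expressed as a `transformers.TrainingArguments` sketch. The output directory is a hypothetical placeholder, and passing the fractional warmup value as `warmup_ratio` is an assumption (the card lists it under `warmup_steps`):

```python
from transformers import TrainingArguments

# Config sketch mirroring the reported hyperparameters; "./foxy-nlp-br"
# and the warmup_ratio interpretation are assumptions, not from the card.
args = TrainingArguments(
    output_dir="./foxy-nlp-br",            # hypothetical path
    learning_rate=4.197342969974621e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",                   # AdamW; betas=(0.9, 0.999), eps=1e-08 are defaults
    lr_scheduler_type="linear",
    warmup_ratio=0.0022118889184775006,    # reported as warmup_steps, but fractional
    num_train_epochs=22,
)
```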
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Weighted | F1 Macro | F1 Saudacao | F1 Cancelamento | F1 Reclamacao | F1 Financeiro | F1 Suporte Tecnico | F1 Elogio | F1 Informacao | F1 Pedido Entrega | F1 Conta Perfil | F1 Negociacao Retencao | F1 Min Class |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.8071 | 1.0 | 125 | 0.8255 | 0.7943 | 0.7724 | 0.7549 | 0.8966 | 0.7059 | 0.7692 | 0.7619 | 0.8261 | 0.9143 | 0.75 | 0.9375 | 0.8333 | 0.1538 | 0.1538 |
| 0.2542 | 2.0 | 250 | 0.3204 | 0.9257 | 0.9254 | 0.9301 | 0.9630 | 0.9714 | 0.8824 | 1.0 | 0.8837 | 0.9412 | 0.8421 | 0.9375 | 0.9231 | 0.9565 | 0.8421 |
| 0.0880 | 3.0 | 375 | 0.2739 | 0.9314 | 0.9306 | 0.9334 | 0.9286 | 1.0 | 0.8824 | 1.0 | 0.9048 | 0.9412 | 0.8571 | 0.9375 | 0.9259 | 0.9565 | 0.8571 |
| 0.0176 | 4.0 | 500 | 0.2407 | 0.9486 | 0.9482 | 0.9519 | 1.0 | 1.0 | 0.8824 | 1.0 | 0.9048 | 0.9412 | 0.9231 | 0.9677 | 0.9434 | 0.9565 | 0.8824 |
| 0.0129 | 5.0 | 625 | 0.2209 | 0.9543 | 0.9541 | 0.9563 | 1.0 | 1.0 | 0.8824 | 1.0 | 0.9302 | 0.9412 | 0.9231 | 0.9677 | 0.9615 | 0.9565 | 0.8824 |
| 0.0072 | 6.0 | 750 | 0.2345 | 0.9543 | 0.9541 | 0.9569 | 1.0 | 1.0 | 0.8824 | 1.0 | 0.9302 | 0.9412 | 0.9474 | 0.9677 | 0.9434 | 0.9565 | 0.8824 |
| 0.0085 | 7.0 | 875 | 0.2161 | 0.96 | 0.9599 | 0.9638 | 1.0 | 1.0 | 0.8824 | 1.0 | 0.9302 | 0.9412 | 0.9730 | 0.9677 | 0.9434 | 1.0 | 0.8824 |
| 0.0061 | 8.0 | 1000 | 0.2211 | 0.96 | 0.9599 | 0.9638 | 1.0 | 1.0 | 0.8824 | 1.0 | 0.9302 | 0.9412 | 0.9730 | 0.9677 | 0.9434 | 1.0 | 0.8824 |
| 0.0076 | 9.0 | 1125 | 0.1966 | 0.96 | 0.9599 | 0.9638 | 1.0 | 1.0 | 0.8824 | 1.0 | 0.9302 | 0.9412 | 0.9730 | 0.9677 | 0.9434 | 1.0 | 0.8824 |
| 0.0053 | 10.0 | 1250 | 0.1642 | 0.9714 | 0.9714 | 0.9740 | 1.0 | 1.0 | 0.8824 | 1.0 | 0.9545 | 0.9412 | 1.0 | 1.0 | 0.9615 | 1.0 | 0.8824 |
| 0.0069 | 11.0 | 1375 | 0.1517 | 0.9657 | 0.9657 | 0.9680 | 1.0 | 1.0 | 0.8824 | 1.0 | 0.9545 | 0.9412 | 0.9730 | 0.9677 | 0.9615 | 1.0 | 0.8824 |
| 0.0049 | 12.0 | 1500 | 0.1775 | 0.96 | 0.9599 | 0.9638 | 1.0 | 1.0 | 0.8824 | 1.0 | 0.9302 | 0.9412 | 0.9730 | 0.9677 | 0.9434 | 1.0 | 0.8824 |
| 0.0051 | 13.0 | 1625 | 0.1730 | 0.9657 | 0.9657 | 0.9680 | 1.0 | 1.0 | 0.8824 | 1.0 | 0.9545 | 0.9412 | 0.9730 | 0.9677 | 0.9615 | 1.0 | 0.8824 |
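Selecting the best checkpoint by validation loss from the logged results above is a one-liner; a minimal sketch over the (epoch, validation loss) pairs from the table:

```python
# (epoch, validation_loss) pairs taken from the training results table.
val_losses = [
    (1, 0.8255), (2, 0.3204), (3, 0.2739), (4, 0.2407), (5, 0.2209),
    (6, 0.2345), (7, 0.2161), (8, 0.2211), (9, 0.1966), (10, 0.1642),
    (11, 0.1517), (12, 0.1775), (13, 0.1730),
]

# Pick the epoch with the lowest validation loss.
best_epoch, best_loss = min(val_losses, key=lambda pair: pair[1])
print(best_epoch, best_loss)  # 11 0.1517
```

By this criterion the epoch-11 checkpoint is the strongest of the logged runs; the loss rises again in epochs 12 and 13, a common early sign of overfitting.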
### Framework versions
- Transformers 5.3.0
- Pytorch 2.10.0+cu128
- Datasets 3.0.0
- Tokenizers 0.22.2