# apriadiazriel/bert-cased-jnlpba
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/google-bert/bert-base-cased) on the JNLPBA dataset. It achieves the following results (the train loss is measured on the training set; all other metrics on the evaluation set):
- Train Loss: 0.0851
- Validation Loss: 0.2221
- Precision: 0.6744
- Recall: 0.7808
- F1: 0.7237
- Accuracy: 0.9371
- Epoch: 4 (the last of 5 epochs; epoch numbering is zero-based, as in the table below)
## Model description
More information needed
## Intended uses & limitations
More information needed
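Although no usage notes are provided, a token-classification checkpoint like this one is typically queried through the `transformers` pipeline API. The sketch below is illustrative only: the repo id comes from this card, while `framework="tf"` (the card lists TensorFlow versions) and the JNLPBA entity labels are assumptions.

```python
from transformers import pipeline

# Minimal inference sketch (assumptions: TF weights are available in the repo,
# and the label set is the JNLPBA one: DNA, RNA, protein, cell_type, cell_line).
ner = pipeline(
    "token-classification",
    model="apriadiazriel/bert-cased-jnlpba",
    framework="tf",                  # the card lists TensorFlow 2.18.0
    aggregation_strategy="simple",   # merge word pieces into entity spans
)

print(ner("The IL-2 gene is expressed in activated human T lymphocytes."))
```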
## Training and evaluation data
More information needed
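The card only names the JNLPBA dataset. Assuming the standard `jnlpba` dataset id on the Hugging Face Hub (an assumption this card does not confirm), the data could be loaded as follows:

```python
from datasets import load_dataset

# Assumed dataset id; the card only says "the JNLPBA dataset".
raw = load_dataset("jnlpba")

print(raw)  # expected: train and validation splits
example = raw["train"][0]
print(example["tokens"][:8])    # whitespace-tokenized words
print(example["ner_tags"][:8])  # integer IOB2 tags (B-/I- DNA, RNA, protein, cell_type, cell_line)
```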
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: AdamWeightDecay (beta_1 = 0.9, beta_2 = 0.999, epsilon = 1e-08, amsgrad = False, weight_decay_rate = 0.01)
- learning rate schedule: PolynomialDecay (initial_learning_rate = 2e-05, decay_steps = 5795, end_learning_rate = 0.0, power = 1.0, cycle = False), i.e. a linear decay from 2e-05 to 0
- training_precision: float32
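These values correspond to what `transformers.create_optimizer` builds for TensorFlow: an `AdamWeightDecay` optimizer paired with a linear `PolynomialDecay` schedule. A minimal reconstruction, assuming the 5,795 total train steps listed above:

```python
from transformers import create_optimizer

# Rebuilds the optimizer described above: AdamWeightDecay with a linear
# PolynomialDecay of the learning rate from 2e-05 to 0 over 5795 steps,
# weight_decay_rate=0.01, and the default betas/epsilon.
optimizer, lr_schedule = create_optimizer(
    init_lr=2e-5,
    num_train_steps=5795,
    num_warmup_steps=0,       # no warmup appears in the config above
    weight_decay_rate=0.01,
)
```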
### Training results
| Train Loss | Validation Loss | Precision | Recall | F1 | Accuracy | Epoch |
|---|---|---|---|---|---|---|
| 0.2424 | 0.1998 | 0.6507 | 0.7606 | 0.7014 | 0.9322 | 0 |
| 0.1426 | 0.1975 | 0.6613 | 0.7832 | 0.7171 | 0.9364 | 1 |
| 0.1166 | 0.2051 | 0.6527 | 0.7847 | 0.7127 | 0.9353 | 2 |
| 0.0984 | 0.2108 | 0.6750 | 0.7811 | 0.7242 | 0.9378 | 3 |
| 0.0851 | 0.2221 | 0.6744 | 0.7808 | 0.7237 | 0.9371 | 4 |
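The card does not state how precision, recall, and F1 were computed; for token classification they are usually entity-level scores from `seqeval` (an assumption, not confirmed by this card). A toy example of that scoring:

```python
from seqeval.metrics import f1_score, precision_score, recall_score

# Entity-level scoring on IOB2 tag sequences (JNLPBA-style labels).
y_true = [["B-protein", "I-protein", "O", "B-cell_type", "I-cell_type"]]
y_pred = [["B-protein", "I-protein", "O", "B-cell_type", "O"]]

print(precision_score(y_true, y_pred))  # 0.5: 1 of 2 predicted entities is an exact match
print(recall_score(y_true, y_pred))     # 0.5: 1 of 2 gold entities recovered
print(f1_score(y_true, y_pred))         # 0.5
```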
### Framework versions
- Transformers 4.48.3
- TensorFlow 2.18.0
- Datasets 3.3.2
- Tokenizers 0.21.0