---
license: apache-2.0
base_model: t5-small
tags:
- generated_from_keras_callback
model-index:
- name: dyu-fr-t5-small
results: []
---
# dyu-fr-t5-small
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unspecified dataset.
It achieves the following results at the final training epoch:
- Train Loss: 2.7460
- Validation Loss: 2.7296
- Epoch: 24
## Model description
More information needed
## Intended uses & limitations
More information needed
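No usage example is provided in this card. The sketch below is a hypothetical, minimal inference snippet based only on the framework versions listed further down (TensorFlow + Transformers); the bare repository id `dyu-fr-t5-small` is a placeholder and should be replaced with the model's full Hugging Face repository path. Judging by the name, the model may be intended for Dyula-to-French translation, but that is an assumption.

```python
def translate(text, repo_id="dyu-fr-t5-small"):
    """Hypothetical inference sketch; replace repo_id with the real repository path."""
    # Imported inside the function so the sketch can be read/defined
    # without transformers installed.
    from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = TFAutoModelForSeq2SeqLM.from_pretrained(repo_id)  # TF weights, matching the Keras training setup
    inputs = tokenizer(text, return_tensors="tf")
    output_ids = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```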
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
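For readers unfamiliar with `AdamWeightDecay` (Transformers' TensorFlow AdamW variant), the update it performs can be sketched in plain Python with the hyperparameters above. This is an illustrative scalar version, not the actual Keras optimizer: weight decay is decoupled, i.e. applied directly to the weights rather than folded into the gradient.

```python
import math

def adamw_step(w, grad, m, v, t, lr=2e-05, beta_1=0.9, beta_2=0.999,
               epsilon=1e-07, weight_decay_rate=0.01):
    """One decoupled-weight-decay Adam step; returns updated (w, m, v).

    t is the 1-based step index, used for bias correction.
    """
    m = beta_1 * m + (1.0 - beta_1) * grad           # first-moment (mean) estimate
    v = beta_2 * v + (1.0 - beta_2) * grad * grad    # second-moment estimate
    m_hat = m / (1.0 - beta_1 ** t)                  # bias-corrected moments
    v_hat = v / (1.0 - beta_2 ** t)
    # Decay term uses the weight itself, not the gradient (AdamW).
    w = w - lr * (m_hat / (math.sqrt(v_hat) + epsilon) + weight_decay_rate * w)
    return w, m, v
```

For example, repeatedly applying this step to minimize f(w) = w² (gradient 2w) steadily shrinks |w|.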
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 3.6901 | 3.2495 | 0 |
| 3.4675 | 3.1435 | 1 |
| 3.3768 | 3.0836 | 2 |
| 3.3150 | 3.0325 | 3 |
| 3.2568 | 2.9926 | 4 |
| 3.2142 | 2.9639 | 5 |
| 3.1743 | 2.9341 | 6 |
| 3.1376 | 2.9166 | 7 |
| 3.1019 | 2.8968 | 8 |
| 3.0716 | 2.8781 | 9 |
| 3.0480 | 2.8647 | 10 |
| 3.0187 | 2.8465 | 11 |
| 2.9931 | 2.8347 | 12 |
| 2.9661 | 2.8247 | 13 |
| 2.9482 | 2.8123 | 14 |
| 2.9176 | 2.7992 | 15 |
| 2.9023 | 2.7922 | 16 |
| 2.8792 | 2.7826 | 17 |
| 2.8600 | 2.7731 | 18 |
| 2.8422 | 2.7683 | 19 |
| 2.8236 | 2.7567 | 20 |
| 2.7979 | 2.7493 | 21 |
| 2.7856 | 2.7438 | 22 |
| 2.7681 | 2.7346 | 23 |
| 2.7460 | 2.7296 | 24 |
### Framework versions
- Transformers 4.38.2
- TensorFlow 2.16.1
- Datasets 2.18.0
- Tokenizers 0.15.2