# Ar-Mulitlingual-MiniLM

This model is a fine-tuned version of [microsoft/Multilingual-MiniLM-L12-H384](https://huggingface.co/microsoft/Multilingual-MiniLM-L12-H384) on an unknown dataset.
## Model description

More information needed
## Intended uses & limitations

More information needed
## Training and evaluation data

More information needed
## Training procedure
### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
- mixed_precision_training: Native AMP
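The linear scheduler listed above decays the learning rate from its initial value to zero over the course of training. A minimal sketch of that decay, assuming no warmup steps (the Trainer default when none are reported):

```python
def linear_lr(step, total_steps, base_lr=5e-5):
    """Linearly decay base_lr to 0 over total_steps (no warmup assumed)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# Example with a hypothetical run of 1000 optimizer steps:
print(linear_lr(0, 1000))     # start of training: full learning rate
print(linear_lr(500, 1000))   # halfway: half the learning rate
print(linear_lr(1000, 1000))  # end of training: 0.0
```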
### Training results

More information needed
### Framework versions

- Transformers 4.18.0
- Pytorch 1.11.0+cu113
- Tokenizers 0.12.1
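As a sketch (not the authors' actual script), the hyperparameters reported above map onto `transformers.TrainingArguments` in the 4.18.0 API roughly as follows; `./output` is a placeholder directory:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the reported training configuration.
args = TrainingArguments(
    output_dir="./output",           # placeholder, not from the card
    learning_rate=5e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    fp16=True,                       # Native AMP mixed precision
)
```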