How to use cartesinus/multilingual_minilm-amazon-massive-intent with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="cartesinus/multilingual_minilm-amazon-massive-intent")
```
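For a quick smoke test of the pipeline, call it on an utterance (the utterance, label, and score below are illustrative; the actual labels come from the MASSIVE intent set):

```python
# Illustrative call; the exact label string and score will vary.
result = pipe("wake me up at seven tomorrow morning")
print(result)  # e.g. [{'label': 'alarm_set', 'score': 0.97}]
```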
```python
# Load the model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("cartesinus/multilingual_minilm-amazon-massive-intent")
model = AutoModelForSequenceClassification.from_pretrained("cartesinus/multilingual_minilm-amazon-massive-intent")
```
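Manual inference with the directly loaded tokenizer and model takes a few more lines; this is a minimal sketch, and the example utterance is again illustrative:

```python
import torch

# Tokenize, run the classifier, and map the top logit back to an intent name.
inputs = tokenizer("wake me up at seven tomorrow morning", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred_id = int(logits.argmax(dim=-1))
print(model.config.id2label[pred_id])  # predicted MASSIVE intent
```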
This model is a fine-tuned version of microsoft/Multilingual-MiniLM-L12-H384 on the MASSIVE 1.1 dataset. It achieves the following results on the evaluation set (final epoch in the table below):

- Validation Loss: 0.8941
- Accuracy: 0.8234
- F1: 0.8234
Training results:
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|---|---|---|---|---|---|
| 3.7961 | 1.0 | 720 | 3.1657 | 0.3404 | 0.3404 |
| 3.1859 | 2.0 | 1440 | 2.4835 | 0.4343 | 0.4343 |
| 2.3104 | 3.0 | 2160 | 2.0474 | 0.5652 | 0.5652 |
| 2.0071 | 4.0 | 2880 | 1.7190 | 0.6503 | 0.6503 |
| 1.5595 | 5.0 | 3600 | 1.4873 | 0.6990 | 0.6990 |
| 1.3664 | 6.0 | 4320 | 1.3088 | 0.7354 | 0.7354 |
| 1.1272 | 7.0 | 5040 | 1.1964 | 0.7521 | 0.7521 |
| 1.0128 | 8.0 | 5760 | 1.1115 | 0.7718 | 0.7718 |
| 0.9405 | 9.0 | 6480 | 1.0598 | 0.7841 | 0.7841 |
| 0.7758 | 10.0 | 7200 | 1.0003 | 0.7944 | 0.7944 |
| 0.7457 | 11.0 | 7920 | 0.9599 | 0.8037 | 0.8037 |
| 0.6605 | 12.0 | 8640 | 0.9175 | 0.8165 | 0.8165 |
| 0.6135 | 13.0 | 9360 | 0.9148 | 0.8190 | 0.8190 |
| 0.5698 | 14.0 | 10080 | 0.8976 | 0.8229 | 0.8229 |
| 0.5578 | 15.0 | 10800 | 0.8941 | 0.8234 | 0.8234 |
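The hyperparameter list itself is not included in this card. For orientation, a run of this shape could look roughly like the Trainer sketch below; only num_train_epochs=15 is taken from the table above, while the dataset locale, batch size, and learning rate are assumed placeholders:

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Assumed setup: only num_train_epochs=15 is grounded in the results table;
# the locale, batch size, and learning rate are illustrative placeholders.
dataset = load_dataset("AmazonScience/massive", "en-US")
tokenizer = AutoTokenizer.from_pretrained("microsoft/Multilingual-MiniLM-L12-H384")

def tokenize(batch):
    return tokenizer(batch["utt"], truncation=True)

dataset = dataset.map(tokenize, batched=True)
dataset = dataset.rename_column("intent", "labels")

num_labels = dataset["train"].features["labels"].num_classes
model = AutoModelForSequenceClassification.from_pretrained(
    "microsoft/Multilingual-MiniLM-L12-H384", num_labels=num_labels)

args = TrainingArguments(
    output_dir="multilingual_minilm-amazon-massive-intent",
    num_train_epochs=15,             # matches the table above
    per_device_train_batch_size=16,  # assumed; not stated in the card
    learning_rate=2e-5,              # assumed; not stated in the card
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via DataCollatorWithPadding
)
trainer.train()
```

Note that the actual training setup (locale coverage, optimizer, schedule) may differ from these assumptions.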