Tags: Text Classification, Transformers, PyTorch, TensorBoard, distilbert, Generated from Trainer, text-embeddings-inference
Instructions to use Zain6699/intent-classifier with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Zain6699/intent-classifier with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="Zain6699/intent-classifier")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("Zain6699/intent-classifier")
model = AutoModelForSequenceClassification.from_pretrained("Zain6699/intent-classifier")
```
- Notebooks
- Google Colab
- Kaggle
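A note on what the pipeline above returns: for text classification it yields a list of `{"label": ..., "score": ...}` dicts per input. The sketch below shows a small helper for picking the top-scoring intent; the label names in the example output are purely illustrative, since the card does not list this model's actual labels.

```python
def top_intent(predictions):
    """Return (label, score) of the highest-scoring prediction.

    `predictions` is pipeline-style output: a list of dicts with
    "label" and "score" keys.
    """
    best = max(predictions, key=lambda p: p["score"])
    return best["label"], best["score"]


# Illustrative pipeline-style output (labels are hypothetical,
# not taken from the model's config):
preds = [
    {"label": "open_account", "score": 0.91},
    {"label": "close_account", "score": 0.06},
    {"label": "other", "score": 0.03},
]
label, score = top_intent(preds)
```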
update model card README.md
README.md
CHANGED

```diff
@@ -17,9 +17,9 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Accuracy: 0.
-- F1: 0.
+- Loss: 0.0590
+- Accuracy: 0.9854
+- F1: 0.9586
 
 ## Model description
 
```
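The commit above fills in the evaluation metrics (Loss, Accuracy, F1). As a minimal sketch of how such figures are typically produced, the function below follows the Hugging Face Trainer's `compute_metrics` convention using scikit-learn; the averaging mode (`macro`) is an assumption, since the card does not say how F1 was aggregated.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score


def compute_metrics(eval_pred):
    """Trainer-style metrics hook: takes (logits, labels), returns a dict.

    This is an illustrative implementation, not the training script
    used for this model.
    """
    logits, labels = eval_pred
    # Predicted class = index of the largest logit per example.
    predictions = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, predictions),
        "f1": f1_score(labels, predictions, average="macro"),
    }
```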