---
library_name: transformers
tags: []
---
# MagicSupport Intent Classifier (BERT Fine-Tuned)
## Overview
This model is a fine-tuned version of `bert-base-uncased` for multi-class intent classification in customer support environments.
It is optimized for:
* Fast inference
* High accuracy
* Low deployment cost
* Production-ready intent routing for support systems
The model is designed for the MagicSupport platform but is generalizable to structured customer support intent detection tasks.
---
## Model Details
* Base Model: `bert-base-uncased`
* Architecture: `BertForSequenceClassification`
* Task: Multi-class intent classification
* Number of Intents: 28
* Training Dataset: `bitext/Bitext-customer-support-llm-chatbot-training-dataset`
* Loss: CrossEntropy with class weights
* Framework: Hugging Face Transformers (PyTorch)
---
## Performance
### Validation Metrics (Epoch 5)
* Accuracy: **0.9983**
* F1 Micro: **0.9983**
* F1 Macro: **0.9983**
* Validation Loss: **0.0087**
The model shows stable convergence across 5 epochs. Note that the near-perfect validation scores partly reflect the structured, templated nature of the training dataset; accuracy on real-world traffic may be lower.
---
## Example Predictions
| Query | Predicted Intent | Confidence |
| ------------------------------------- | ---------------- | ---------- |
| I want to cancel my order | cancel_order | 0.999 |
| How do I track my shipment | delivery_options | 0.997 |
| I need a refund for my purchase | get_refund | 0.999 |
| I forgot my password | recover_password | 0.999 |
| I have a complaint about your service | complaint | 0.996 |
| hello | FALLBACK | 0.999 |
The model also correctly identifies low-information inputs and maps them to a fallback intent.
---
## Intended Use
This model is intended for:
* Customer support intent classification
* Chatbot routing
* Support ticket categorization
* Voice-to-intent pipelines (after STT)
* Pre-routing before LLM or RAG systems
Typical production flow:
User Query → BERT Intent Classifier → Route to:
* Knowledge Base Retrieval
* Ticketing System
* Escalation to Human
* Fallback LLM
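The routing step in the flow above can be sketched as a simple lookup table. The handler names here (`ticketing_system`, `knowledge_base`, etc.) are hypothetical placeholders; the actual MagicSupport routing targets are not part of this card.

```python
# Hypothetical intent → handler mapping for illustration only.
ROUTES = {
    "cancel_order": "ticketing_system",
    "get_refund": "ticketing_system",
    "complaint": "human_escalation",
    "delivery_options": "knowledge_base",
    "FALLBACK": "fallback_llm",
}

def route(prediction: dict) -> str:
    """Map a classifier prediction to a downstream handler,
    defaulting to knowledge-base retrieval for unlisted intents."""
    return ROUTES.get(prediction["intent"], "knowledge_base")

print(route({"intent": "complaint", "confidence": 0.996}))  # human_escalation
```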
---
## Example Usage
```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch
# Load model and tokenizer from HuggingFace Hub
model_name = "learn-abc/magicSupport-intent-classifier"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
# Set device
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)
model.eval()
# Prediction function
def predict_intent(text, confidence_threshold=0.75):
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=64)
    inputs = {k: v.to(device) for k, v in inputs.items()}

    with torch.no_grad():
        outputs = model(**inputs)

    logits = outputs.logits
    probs = torch.softmax(logits, dim=-1)
    confidence, prediction = torch.max(probs, dim=-1)

    predicted_intent = model.config.id2label[prediction.item()]
    confidence_score = confidence.item()

    # Apply confidence threshold
    if confidence_score < confidence_threshold:
        predicted_intent = "FALLBACK"

    return {
        "intent": predicted_intent,
        "confidence": confidence_score,
    }

# Example usage
queries = [
    "I want to cancel my order",
    "How do I track my package",
    "I need a refund",
    "hello there",
]

for query in queries:
    result = predict_intent(query)
    print(f"Query: {query}")
    print(f"Intent: {result['intent']}")
    print(f"Confidence: {result['confidence']:.3f}\n")
```
---
## Design Decisions
* BERT selected over larger LLMs for:
* Low latency
* Cost efficiency
* Predictable inference
* Edge deployability
* Class weighting applied to mitigate dataset imbalance.
* High-confidence outputs indicate strong separation between intent classes.
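The class weighting mentioned above can be sketched with a simple inverse-frequency scheme. This is an assumption for illustration; the exact weighting formula used during training is not specified in this card.

```python
from collections import Counter

def inverse_frequency_weights(labels, num_classes):
    """Weight each class by total / (num_classes * count) so that
    rare intents contribute more to the loss."""
    counts = Counter(labels)
    total = len(labels)
    return [
        total / (num_classes * counts.get(c, 1))
        for c in range(num_classes)
    ]

# Toy label distribution: class 0 dominates.
labels = [0, 0, 0, 1, 2, 2]
weights = inverse_frequency_weights(labels, num_classes=3)
# The weights would then be passed to the loss, e.g.:
# loss_fn = torch.nn.CrossEntropyLoss(weight=torch.tensor(weights))
```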
---
## Known Limitations
* Designed for structured customer support queries.
* May struggle with:
* Highly conversational multi-turn context
* Extremely domain-specific enterprise terminology
* Heavy slang or multilingual input
* Not trained for open-domain conversation.
---
## Future Improvements
* Add MagicSupport real production data for domain adaptation.
* Add hierarchical intent structure.
* Introduce confidence threshold calibration.
* Add OOD (Out-of-Distribution) detection.
* Quantized inference version for edge deployment.
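One route to the quantized inference version mentioned above is PyTorch's post-training dynamic INT8 quantization. The sketch below uses a tiny linear layer as a stand-in for the fine-tuned BERT checkpoint; quantizing the real model works the same way after loading it with `from_pretrained`.

```python
import torch

# A tiny stand-in model; the real BERT checkpoint would be loaded
# with AutoModelForSequenceClassification.from_pretrained instead.
model = torch.nn.Sequential(torch.nn.Linear(16, 4))

# Replace nn.Linear layers with dynamically quantized INT8 versions.
quantized = torch.quantization.quantize_dynamic(
    model, {torch.nn.Linear}, dtype=torch.qint8
)

output = quantized(torch.randn(1, 16))
print(tuple(output.shape))  # (1, 4)
```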
---
## License
Specify your intended license here (e.g., MIT, Apache-2.0).
---
## Citation
If using this model in research or production, please cite appropriately.
---
## Model Card Author
For any inquiries or support, please reach out to:
* **Author:** [Abhishek Singh](https://github.com/SinghIsWriting/)
* **LinkedIn:** [My LinkedIn Profile](https://www.linkedin.com/in/abhishek-singh-bba2662a9)
* **Portfolio:** [Abhishek Singh Portfolio](https://portfolio-abhishek-singh-nine.vercel.app/)