---
language: en
license: apache-2.0
tags:
  - text-classification
  - intent-detection
  - distilbert
  - nlu
datasets:
  - custom
metrics:
  - accuracy
  - f1
pipeline_tag: text-classification
---

# DistilBERT NLU Intent Classification

A fine-tuned DistilBERT model for intent classification in Natural Language Understanding (NLU) systems.

## Model Details

- **Base Model:** distilbert-base-uncased
- **Task:** Intent Classification (Sequence Classification)
- **Number of Labels:** 8
- **Framework:** PyTorch + Transformers

## Supported Intents

| ID | Intent | Description |
|----|--------|-------------|
| 0 | BILLING_ISSUE | Problems with bills or charges |
| 1 | CANCEL_SUBSCRIPTION | Cancel service requests |
| 2 | CHECK_BALANCE | Balance inquiry |
| 3 | GOODBYE | Farewell messages |
| 4 | GREETING | Hello/welcome messages |
| 5 | MODIFY_PROFILE | Update account details |
| 6 | ROAMING_ACTIVATION | Enable roaming |
| 7 | ROAMING_DEACTIVATION | Disable roaming |
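The table above can be expressed as a plain ID-to-label mapping, which is handy when working with raw class indices. A minimal sketch (the authoritative mapping ships in the model's `config.json` as `id2label`; the `label_for` helper is illustrative, not part of the model's API):

```python
# ID-to-label mapping as listed in the table above; the model's
# config.json id2label field is the authoritative source.
ID2LABEL = {
    0: "BILLING_ISSUE",
    1: "CANCEL_SUBSCRIPTION",
    2: "CHECK_BALANCE",
    3: "GOODBYE",
    4: "GREETING",
    5: "MODIFY_PROFILE",
    6: "ROAMING_ACTIVATION",
    7: "ROAMING_DEACTIVATION",
}

def label_for(class_id: int) -> str:
    """Translate a raw class index into its intent name."""
    return ID2LABEL[class_id]
```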

## Usage

```python
from transformers import pipeline

classifier = pipeline("text-classification", model="sidde/distilbert-nlu-intent-classification")

# Single prediction
result = classifier("I want to check my balance")
print(result)
# [{"label": "CHECK_BALANCE", "score": 0.98}]
```
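The `score` reported by the pipeline is a softmax over the model's 8 output logits, with the argmax index mapped to its intent name. A stdlib-only sketch of that post-processing step, using made-up logits (real values come from the model's forward pass):

```python
import math

# Hypothetical logits for the 8 intent classes; in practice these come
# from the model's forward pass on the tokenized input.
logits = [0.1, -1.2, 4.5, -0.3, 0.0, 0.2, -0.8, -1.0]

# Softmax: exponentiate and normalize so the probabilities sum to 1.
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# The pipeline reports the argmax class and its probability as "score".
best = max(range(len(probs)), key=probs.__getitem__)
print(best, round(probs[best], 2))  # class 2 -> CHECK_BALANCE
```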

## Training Details

- **Dataset:** 772 examples (custom intent dataset)
- **Train/Eval Split:** 80/20 with stratification
- **Epochs:** 10
- **Batch Size:** 16
- **Learning Rate:** 2e-5
- **Hardware:** NVIDIA L4 GPU on OpenShift AI
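A stratified 80/20 split keeps each intent's share of examples the same in the train and eval sets, which matters for a small 772-example dataset with 8 classes. A stdlib-only sketch of the idea (the card does not say which library was used; `scikit-learn`'s `train_test_split(..., stratify=labels)` is the usual choice):

```python
from collections import defaultdict

def stratified_split(examples, labels, eval_fraction=0.2):
    """Split per label so each intent keeps ~eval_fraction in the eval set."""
    by_label = defaultdict(list)
    for ex, lab in zip(examples, labels):
        by_label[lab].append(ex)

    train, evals = [], []
    for lab, items in by_label.items():
        n_eval = max(1, round(len(items) * eval_fraction))
        evals.extend(items[:n_eval])
        train.extend(items[n_eval:])
    return train, evals

# Toy data: 10 examples of one intent, 5 of another.
examples = [f"ex{i}" for i in range(15)]
labels = ["GREETING"] * 10 + ["GOODBYE"] * 5
train, evals = stratified_split(examples, labels)
print(len(train), len(evals))  # 12 3
```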

## Deployment

This model is deployed on OpenShift AI using KServe.
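A KServe `InferenceService` for a Hugging Face model looks roughly like the following sketch; the resource name, namespace defaults, and storage URI are placeholders, not the actual deployment manifest:

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: distilbert-nlu-intent   # placeholder name
spec:
  predictor:
    model:
      modelFormat:
        name: huggingface
      # Placeholder location; point this at wherever the model artifacts live.
      storageUri: "oci://example-registry/models/distilbert-nlu-intent"
      resources:
        limits:
          nvidia.com/gpu: "1"
```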

## License

Apache 2.0