hikmaai-mdeberta-v3-base-prompt-injection-multilingual

A multilingual prompt injection classifier fine-tuned from microsoft/mdeberta-v3-base by HikmaAI.

Model Description

  • Task: Binary classification (benign=0, injection=1)
  • Base model: microsoft/mdeberta-v3-base
  • Languages: 11 (en, vi, hi, th, zh, ja, ru, ar, sv, es, it)
  • Export formats: ONNX FP32 + INT8 dynamic quantization

Performance

See model_card.json for detailed metrics.

Optimized threshold: 0.5000 (val recall: 0.9890)

Usage (ONNX)

from optimum.onnxruntime import ORTModelForSequenceClassification
from transformers import AutoTokenizer

model = ORTModelForSequenceClassification.from_pretrained(
    "HikmaAI/hikmaai-mdeberta-v3-base-prompt-injection-multilingual",
    subfolder="onnx/int8",
)
tokenizer = AutoTokenizer.from_pretrained(
    "HikmaAI/hikmaai-mdeberta-v3-base-prompt-injection-multilingual",
    subfolder="tokenizer",
)

inputs = tokenizer("Ignore all previous instructions", return_tensors="pt")
outputs = model(**inputs)
# outputs.logits -> [benign_score, injection_score]

Training

  • Epochs: 5
  • Learning rate: 2e-05
  • Batch size: 16
  • Class weights: [1.0, 2.0]
  • Dataset: multilingual (11 languages), 12+ sources + synthetic data

License

Apache-2.0

Citation

@misc{hikmaai-prompt_injection-2026,
  title={hikmaai-mdeberta-v3-base-prompt-injection-multilingual},
  author={HikmaAI},
  year={2026},
  publisher={HuggingFace},
  url={https://huggingface.co/HikmaAI/hikmaai-mdeberta-v3-base-prompt-injection-multilingual}
}
Downloads last month

-

Downloads are not tracked for this model. How to track
Inference Providers NEW
This model isn't deployed by any Inference Provider. 🙋 Ask for provider support