# ModernBERT Danish NER (Base) — ONNX INT8

A dynamically quantized INT8 ONNX version of [thomasbeste/modernbert-da-ner-base](https://huggingface.co/thomasbeste/modernbert-da-ner-base).

Quantized with an AVX-512 VNNI configuration for fast CPU inference.

## Benchmark: DaNE Test Set

| Entity | Precision | Recall | F1 | Support |
|-----------|-----------|--------|--------|---------|
| PER | 0.8962 | 0.9061 | 0.9011 | 181 |
| ORG | 0.6929 | 0.6299 | 0.6599 | 154 |
| LOC | 0.7500 | 0.8969 | 0.8169 | 97 |
| MISC | 0.4878 | 0.6316 | 0.5505 | 95 |
| micro avg | 0.7260 | 0.7742 | 0.7493 | |
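As a sanity check, the micro average can be reconstructed from the per-entity rows: estimate true positives as recall × support and predicted counts as TP / precision, then pool them. This sketch uses only the numbers in the table above.

```python
# entity: (precision, recall, support) from the DaNE test-set table
rows = {
    "PER":  (0.8962, 0.9061, 181),
    "ORG":  (0.6929, 0.6299, 154),
    "LOC":  (0.7500, 0.8969, 97),
    "MISC": (0.4878, 0.6316, 95),
}

tp = sum(round(r * s) for p, r, s in rows.values())                # pooled true positives
pred = sum(round(round(r * s) / p) for p, r, s in rows.values())   # pooled predictions
gold = sum(s for _, _, s in rows.values())                         # total gold entities

micro_p = tp / pred
micro_r = tp / gold
micro_f1 = 2 * micro_p * micro_r / (micro_p + micro_r)
print(f"{micro_p:.4f} {micro_r:.4f} {micro_f1:.4f}")  # 0.7260 0.7742 0.7493
```

The pooled counts (408 TP over 562 predicted and 527 gold entities) reproduce the micro-averaged row exactly.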

## Usage

```python
from optimum.onnxruntime import ORTModelForTokenClassification
from transformers import AutoTokenizer, pipeline

model = ORTModelForTokenClassification.from_pretrained("thomasbeste/modernbert-da-ner-base-onnx-int8")
tokenizer = AutoTokenizer.from_pretrained("thomasbeste/modernbert-da-ner-base-onnx-int8")
ner = pipeline("ner", model=model, tokenizer=tokenizer, aggregation_strategy="simple")

results = ner("Jens Peter Hansen bor i København og arbejder hos Novo Nordisk.")
for entity in results:
    print(f"{entity['word']}: {entity['entity_group']} ({entity['score']:.3f})")
```

## Training Details

See the PyTorch model card: [thomasbeste/modernbert-da-ner-base](https://huggingface.co/thomasbeste/modernbert-da-ner-base)
