---
library_name: transformers
tags: []
---

# BERT NER Model (CoNLL-2003)

## 📌 Overview

This model is a fine-tuned version of `bert-base-cased` for Named Entity Recognition (NER).

## 🎯 Task

Token Classification (Named Entity Recognition)

The model identifies the following entity types:

- `PER` (Person)
- `ORG` (Organization)
- `LOC` (Location)
- `MISC` (Miscellaneous)

## 📊 Dataset

- CoNLL-2003 (English news corpus)
- Only the NER annotations were used, in BIO tagging format
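
In the BIO scheme, each token is tagged `B-X` (beginning of an entity of type X), `I-X` (inside one), or `O` (outside any entity). The sketch below shows how a BIO tag sequence maps to entity spans; it is a pure-Python illustration of the scheme, not the model's actual decoding code (the example sentence is a well-known one from CoNLL-2003):

```python
def bio_to_spans(tokens, tags):
    """Collect (entity_type, text) spans from parallel token/BIO-tag lists."""
    spans, current = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):                       # a new entity starts here
            if current:
                spans.append(current)
            current = (tag[2:], [token])
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            current[1].append(token)                   # continue the open entity
        else:                                          # "O" or an invalid continuation
            if current:
                spans.append(current)
            current = None
    if current:
        spans.append(current)
    return [(etype, " ".join(words)) for etype, words in spans]

tokens = ["EU", "rejects", "German", "call", "to", "boycott", "British", "lamb"]
tags   = ["B-ORG", "O", "B-MISC", "O", "O", "O", "B-MISC", "O"]
print(bio_to_spans(tokens, tags))
# [('ORG', 'EU'), ('MISC', 'German'), ('MISC', 'British')]
```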

## 🧠 Model Details

- Base model: `bert-base-cased`
- Architecture: Transformer encoder (BERT)
- Fine-tuned with the Hugging Face Transformers library
- Training epochs: 3

## 📈 Performance

Test set results:

- Precision: ~0.90
- Recall: ~0.91
- F1-score: ~0.91
- Accuracy: ~0.98
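
Precision, recall, and F1 for NER are conventionally computed at the entity level (an entity counts as correct only if both its span and its type match the gold annotation). A small sketch of how these numbers relate, using made-up true/false positive counts purely for illustration:

```python
def prf(tp, fp, fn):
    """Entity-level precision, recall, and F1 from raw counts."""
    precision = tp / (tp + fp)          # correct entities / predicted entities
    recall = tp / (tp + fn)             # correct entities / gold entities
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical counts: 900 correct entities, 100 spurious, 89 missed.
p, r, f = prf(tp=900, fp=100, fn=89)
print(round(p, 2), round(r, 2), round(f, 2))  # 0.9 0.91 0.9
```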

βš™οΈ Usage Example

```python
from transformers import pipeline

# "simple" aggregation merges sub-token predictions into whole entities
ner = pipeline(
    "ner",
    model="x4n4/bert-conll2003-ner",
    aggregation_strategy="simple",
)

text = "Barack Obama visited Google in New York"
print(ner(text))
```
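
The `aggregation_strategy="simple"` option is needed because BERT's WordPiece tokenizer splits rare words into sub-tokens (e.g. "Barack" becomes "Bar" + "##ack"), and each sub-token gets its own prediction. The snippet below is a simplified, pure-Python illustration of the merging step, not the library's actual implementation:

```python
def merge_wordpieces(pieces):
    """Merge WordPiece sub-tokens ('##'-prefixed continuations) into words."""
    words = []
    for piece in pieces:
        if piece.startswith("##") and words:
            words[-1] += piece[2:]     # glue continuation onto the previous word
        else:
            words.append(piece)        # a new word begins
    return words

print(merge_wordpieces(["Bar", "##ack", "Obama", "visited", "Go", "##ogle"]))
# ['Barack', 'Obama', 'visited', 'Google']
```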