---
library_name: transformers
tags: []
---

# BERT NER Model (CoNLL-2003)

## 📌 Overview

This model is a fine-tuned version of `bert-base-cased` for the task of Named Entity Recognition (NER).

## 🎯 Task

Token Classification (Named Entity Recognition)

The model identifies the following entity types:

- PER (Person)
- ORG (Organization)
- LOC (Location)
- MISC (Miscellaneous)

## 📊 Dataset

- CoNLL-2003 (English news dataset)
- Only the NER tags were used (BIO format)

## 🧠 Model Details

- Base model: `bert-base-cased`
- Architecture: Transformer encoder (BERT)
- Fine-tuning: Hugging Face Transformers
- Training epochs: 3

## 📈 Performance

Test set results:

- Precision: ~0.90
- Recall: ~0.91
- F1-score: ~0.91
- Accuracy: ~0.98

## ⚙️ Usage Example

```python
from transformers import pipeline

ner = pipeline(
    "ner",
    model="x4n4/bert-conll2003-ner",
    aggregation_strategy="simple",
)

text = "Barack Obama visited Google in New York"
print(ner(text))
```
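To clarify what `aggregation_strategy="simple"` does with the BIO tags mentioned above, here is a minimal standalone sketch of BIO-to-span aggregation in plain Python. The `aggregate_bio` helper is hypothetical (not part of the model or the Transformers API) and is only meant to illustrate how B-/I- prefixed tags are grouped into entity spans:

```python
def aggregate_bio(tokens, tags):
    """Illustrative sketch: group BIO-tagged tokens into (label, text) spans.

    This is NOT the Transformers implementation, just a simplified model of
    how "B-" starts a new entity and "I-" continues the current one.
    """
    entities = []
    current = None  # (label, list of tokens) for the entity being built
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A B- tag always opens a new entity, closing any open one.
            if current:
                entities.append(current)
            current = (tag[2:], [token])
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            # An I- tag with a matching label extends the open entity.
            current[1].append(token)
        else:
            # "O" (or a mismatched I-) closes the open entity, if any.
            if current:
                entities.append(current)
            current = None
    if current:
        entities.append(current)
    return [(label, " ".join(words)) for label, words in entities]


tokens = ["Barack", "Obama", "visited", "Google", "in", "New", "York"]
tags = ["B-PER", "I-PER", "O", "B-ORG", "O", "B-LOC", "I-LOC"]
print(aggregate_bio(tokens, tags))
# → [('PER', 'Barack Obama'), ('ORG', 'Google'), ('LOC', 'New York')]
```

The real pipeline additionally merges WordPiece subtokens and averages their scores, but the span-grouping logic follows the same BIO convention shown here.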