---
datasets:
- ner_distillbert
language:
- en
library_name: pytorch
license: apache-2.0
tags:
- ner
- pytorch
- mlflow
task: token-classification
---
# ner_distillbert

## Model Description

This is a Named Entity Recognition (NER) model based on DistilBERT, a distilled version of BERT that retains 97% of BERT's language-understanding performance while being 40% smaller and 60% faster. The model identifies and classifies named entities in text, such as persons, organizations, locations, and other predefined categories.
**Model Details:**

- **Model Name:** ner_distillbert
- **Version:** 1
- **Task:** Token Classification (Named Entity Recognition)
- **Framework:** PyTorch
- **Language(s):** English (en)
- **License:** apache-2.0
## Intended Uses

This model is designed for named entity recognition (NER) tasks. Please evaluate it on your specific use case before deploying it to production.
## Training Details

### Training Data

- **Dataset:** ner_distillbert
- **Dataset Size:** Not specified

### Training Configuration

Not specified.
## Usage

### Installation

```bash
pip install transformers torch   # for transformers-based models
# OR
pip install -r requirements.txt  # for other frameworks
```
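The `inference.py` helper used below ships with the model repository. If you need the repository files locally first, one option (a sketch assuming the `huggingface_hub` package; the repo id is taken from the Citation section and may differ from your deployment id) is a snapshot download:

```python
# Sketch: download the repository files (weights, inference.py) locally.
# Assumes the huggingface_hub package; the repo id below comes from the
# Citation section of this card and may differ in your environment.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="marlonbino/ner_distillbert")
print("Repository files downloaded to:", local_dir)
```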
### Basic Usage

```python
# Load and use the model
from inference import Model  # see inference.py in the repository

model = Model("openchs/ner_distillbert_v1")
predictions = model.predict(
    ["President Biden met with Chancellor Merkel at the White House to discuss NATO policies."]
)
print(predictions)
```
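If the repository hosts a standard Hugging Face token-classification checkpoint (an assumption, since this card only documents the `inference.py` wrapper), the model can also be used through the `transformers` pipeline; the repo id below again comes from the Citation section:

```python
# Sketch: direct use via the transformers pipeline, assuming a standard
# token-classification checkpoint is hosted under the cited repo id.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="marlonbino/ner_distillbert",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entities
)

text = "President Biden met with Chancellor Merkel at the White House to discuss NATO policies."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```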
## Performance Metrics

### Evaluation Results

| Metric | Value |
|--------|-------|
| Epoch | 3.0000 |
| Eval Accuracy | 0.9608 |
| Eval F1 | 0.9433 |
| Eval Loss | 0.1171 |
| Eval Precision | 0.9324 |
| Eval Recall | 0.9608 |
| Eval Runtime | 0.1943 |
| Eval Samples Per Second | 82.3410 |
| Eval Steps Per Second | 10.2930 |
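The precision, recall, and F1 values above are typical of entity-level scoring such as that implemented by the `seqeval` library, although the exact evaluation script is not documented in this card. A minimal sketch with illustrative (not actual) label sequences:

```python
# Minimal sketch of entity-level NER scoring with seqeval (pip install seqeval).
# The label sequences are illustrative only and are not drawn from the
# evaluation data used for this model.
from seqeval.metrics import f1_score, precision_score, recall_score

y_true = [["B-PER", "I-PER", "O", "B-ORG", "O", "B-LOC"]]
y_pred = [["B-PER", "I-PER", "O", "B-ORG", "O", "O"]]

print("precision:", precision_score(y_true, y_pred))
print("recall:   ", recall_score(y_true, y_pred))
print("f1:       ", f1_score(y_true, y_pred))
```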
## MLflow Tracking

- **Experiment:** N/A
- **Run ID:** `N/A`
- **Training Date:** N/A
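The run details above were not recorded. For completeness, a hypothetical sketch of loading a logged model through the MLflow client follows; `<RUN_ID>` and the `model` artifact path are placeholders, not values from this card:

```python
# Hypothetical sketch: loading a logged model from an MLflow run.
# <RUN_ID> and the "model" artifact path are placeholders; the actual run id
# and artifact layout are not recorded in this card.
import mlflow

loaded = mlflow.pyfunc.load_model("runs:/<RUN_ID>/model")
predictions = loaded.predict(
    ["President Biden met with Chancellor Merkel at the White House to discuss NATO policies."]
)
print(predictions)
```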
## Citation

```bibtex
@misc{ner_distillbert_1,
  title={ner_distillbert},
  author={BITZ-AI TEAM},
  year={2025},
  publisher={Hugging Face},
  url={https://huggingface.co/marlonbino/ner_distillbert}
}
```