# dslim/bert-base-NER

Tags: Token Classification · Transformers · PyTorch · TensorFlow · JAX · ONNX · Safetensors · English · bert · Eval Results (legacy)
Instructions to use dslim/bert-base-NER with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
How to use dslim/bert-base-NER with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="dslim/bert-base-NER")

# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("dslim/bert-base-NER")
model = AutoModelForTokenClassification.from_pretrained("dslim/bert-base-NER")
```
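As a quick sanity check, the pipeline can be run on the model card's own example sentence. This is a minimal sketch assuming `transformers` is installed and the model weights can be downloaded; `aggregation_strategy="simple"` merges the B-/I- sub-token tags into whole entity spans.

```python
from transformers import pipeline

# Build the NER pipeline; downloads dslim/bert-base-NER on first run.
pipe = pipeline(
    "token-classification",
    model="dslim/bert-base-NER",
    aggregation_strategy="simple",  # merge B-/I- token tags into entity spans
)

# Example sentence from the model card
results = pipe("My name is Wolfgang and I live in Berlin")
for entity in results:
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```

With aggregation enabled, each result is a dict with `entity_group`, `word`, `score`, `start`, and `end`; here the model tags "Wolfgang" as a person and "Berlin" as a location.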
Update README.md

README.md CHANGED

```diff
@@ -49,7 +49,7 @@ If you'd like to use a larger BERT-large model fine-tuned on the same dataset, a
 ### Available NER models
 | Model Name | Description | Parameters |
 |-------------------|-------------|------------------|
-| [distilbert-NER](https://huggingface.co/dslim/distilbert-NER) | Fine-tuned DistilBERT - a smaller, faster, lighter version of BERT | 66M |
+| [distilbert-NER](https://huggingface.co/dslim/distilbert-NER) **(NEW!)** | Fine-tuned DistilBERT - a smaller, faster, lighter version of BERT | 66M |
 | [bert-large-NER](https://huggingface.co/dslim/bert-large-NER/) | Fine-tuned bert-large-cased - larger model with slightly better performance | 340M |
 | [bert-base-NER](https://huggingface.co/dslim/bert-base-NER)-([uncased](https://huggingface.co/dslim/bert-base-NER-uncased)) | Fine-tuned bert-base, available in both cased and uncased versions | 110M |
```