Instructions to use b3x0m/bert-xomlac-ner with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use b3x0m/bert-xomlac-ner with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="b3x0m/bert-xomlac-ner")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("b3x0m/bert-xomlac-ner")
model = AutoModelForTokenClassification.from_pretrained("b3x0m/bert-xomlac-ner")
```
- Notebooks
- Google Colab
- Kaggle
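A token-classification pipeline returns one dict per tagged token (with keys such as `entity`, `word`, `start`, and `end`). The helper below is a minimal sketch of merging those per-token predictions into entity spans, assuming IOB-style labels (`B-PER`, `I-PER`, ...) and WordPiece `##` continuation markers; the sample predictions are illustrative, not actual output from this model.

```python
# Merge token-level NER predictions (IOB labels assumed) into entity spans.
def merge_entities(tokens):
    spans = []
    for tok in tokens:
        tag = tok["entity"]            # e.g. "B-PER" or "I-PER"
        kind = tag.split("-", 1)[-1]   # strip the B-/I- prefix
        if tag.startswith("B-") or not spans or spans[-1]["type"] != kind:
            # start of a new entity span
            spans.append({"type": kind, "text": tok["word"],
                          "start": tok["start"], "end": tok["end"]})
        else:
            # continuation token: append to the previous span,
            # dropping the WordPiece "##" marker
            spans[-1]["text"] += tok["word"].replace("##", "")
            spans[-1]["end"] = tok["end"]
    return spans

# Illustrative predictions for the text "Adam" split into two subword tokens
sample = [
    {"entity": "B-PER", "word": "Ada", "start": 0, "end": 3},
    {"entity": "I-PER", "word": "##m", "start": 3, "end": 4},
]
print(merge_entities(sample))
# → [{'type': 'PER', 'text': 'Adam', 'start': 0, 'end': 4}]
```

Transformers can also do this grouping for you by passing `aggregation_strategy="simple"` when constructing the pipeline, which is usually preferable to hand-rolled merging.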
Too lazy to write something
A new version fine-tuned from bert-base-uncased on my own dataset.
val_loss = 0.01966 | val_acc = 0.9811 | F1 score = 0.91