Instructions to use KB/bert-base-swedish-cased-ner with libraries and notebooks. Follow these links to get started.
- Libraries
- Transformers
How to use KB/bert-base-swedish-cased-ner with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="KB/bert-base-swedish-cased-ner")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("KB/bert-base-swedish-cased-ner")
model = AutoModelForTokenClassification.from_pretrained("KB/bert-base-swedish-cased-ner")
```

- Notebooks
- Google Colab
- Kaggle
Upload eval_results.txt

eval_results.txt (added, +4 lines):

```
f1 = 0.9222488038277512
loss = 0.04856577956694042
precision = 0.9187321258341278
recall = 0.9257925072046109
```
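As a sanity check on the reported metrics, F1 should be the harmonic mean of precision and recall. A minimal sketch (the formula is standard; the numbers are taken from eval_results.txt above):

```python
# Values reported in eval_results.txt
precision = 0.9187321258341278
recall = 0.9257925072046109
f1_reported = 0.9222488038277512

# F1 is the harmonic mean of precision and recall
f1_computed = 2 * precision * recall / (precision + recall)

# The recomputed value agrees with the reported one (small tolerance
# allows for rounding in how the file was written)
assert abs(f1_computed - f1_reported) < 1e-6
```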