nielsr (HF Staff) committed · Commit 8f504ce · verified · 1 parent: ff375ac

Add library_name to metadata and improve model card documentation

Hi! I'm Niels from the Hugging Face community science team.

This PR adds the `library_name: transformers` metadata tag to your model card. This will enable the "Use in Transformers" button and generate an automated code snippet on the model page, making it easier for users to load and use your model.

I have also kept your excellent documentation regarding the MultiCoNER2 dataset, model performance, and the sample usage for the AWED-FiNER agentic tool.
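Once `library_name: transformers` is set, the model page offers an auto-generated loading snippet. A minimal sketch of that usage is below; the repo id is a placeholder, since the actual Hub id is not shown in this PR, and the entry point is wrapped in a function so nothing is downloaded at import time:

```python
from transformers import pipeline

# Hypothetical repo id -- replace with this model's actual Hugging Face Hub id.
MODEL_ID = "your-username/xlm-roberta-large-multiconer2-de"


def load_ner(model_id: str = MODEL_ID):
    # "token-classification" matches the pipeline_tag in the card metadata;
    # aggregation_strategy="simple" merges sub-word pieces into entity spans.
    return pipeline("token-classification", model=model_id,
                    aggregation_strategy="simple")


# Usage (downloads the checkpoint on first call):
# ner = load_ner()
# ner("Der Berliner Hauptbahnhof ist eine Station.")
```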

Files changed (1): README.md (+8, -4)
README.md CHANGED
```diff
@@ -1,24 +1,27 @@
 ---
-license: mit
+base_model:
+- FacebookAI/xlm-roberta-large
 datasets:
 - MultiCoNER/multiconer_v2
 language:
 - de
+license: mit
 metrics:
 - f1
 - precision
 - recall
-base_model:
-- FacebookAI/xlm-roberta-large
 pipeline_tag: token-classification
 tags:
 - NER
 - Named_Entity_Recognition
 pretty_name: MultiCoNER2 German XLM-RoBERTa
+library_name: transformers
 ---
 
 **XLM-RoBERTa is fine-tuned on German [MultiCoNER2](https://huggingface.co/datasets/MultiCoNER/multiconer_v2) dataset for Fine-grained Named Entity Recognition.**
 
+This model is part of the [AWED-FiNER](https://huggingface.co/papers/2601.10161) framework, providing fine-grained NER solutions.
+
 The tagset of [MultiCoNER2](https://huggingface.co/datasets/MultiCoNER/multiconer_v2) is a fine-grained tagset. The fine to coarse level mapping of the tags are as follows:
 
 * Location (LOC) : Facility, OtherLOC, HumanSettlement, Station
@@ -93,4 +96,5 @@ If you use this model, please cite the following papers:
 booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
 volume={40},
 year={2026}
-}
+}
+```
```
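The card's fine-to-coarse tag mapping can be sketched as a small lookup. Only the Location branch visible in this diff hunk is included (the full README lists further coarse types), and BIO prefixes are handled as an assumption about the model's tagging scheme:

```python
# Fine-to-coarse mapping from the model card; only the LOC branch shown
# in the diff is reproduced here.
FINE_TO_COARSE = {
    "Facility": "LOC",
    "OtherLOC": "LOC",
    "HumanSettlement": "LOC",
    "Station": "LOC",
}


def coarsen(tag: str) -> str:
    """Map a BIO-prefixed fine tag (e.g. 'B-Facility') to its coarse tag ('B-LOC')."""
    if tag == "O":
        return tag
    prefix, _, fine = tag.partition("-")
    # Unknown fine tags pass through unchanged.
    return f"{prefix}-{FINE_TO_COARSE.get(fine, fine)}"


print(coarsen("B-HumanSettlement"))  # -> B-LOC
```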