Commit d2adb82 (verified) · 1 parent: 34e4612
Committed by nielsr (HF Staff)

Add library_name metadata and improve model card structure


This PR adds `library_name: transformers` to the metadata section. This enables the "Use in Transformers" button and automated code snippets on the model page, making it easier for users to load the model.

The model card has also been updated to clearly reference the paper [AWED-FiNER: Agents, Web applications, and Expert Detectors for Fine-grained Named Entity Recognition across 36 Languages for 6.6 Billion Speakers](https://huggingface.co/papers/2601.10161), as this model is one of the expert detectors developed in that work.
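With `library_name: transformers` and `pipeline_tag: token-classification` set, the hub can generate a loading snippet for this model. A minimal sketch of that usage, assuming the standard `transformers` pipeline API (the `model_id` argument is a placeholder; substitute this repository's actual id):

```python
def load_ner(model_id: str):
    """Return a token-classification pipeline for the given model repo.

    Sketch of the kind of snippet the hub auto-generates once
    library_name is set; requires `pip install transformers`.
    """
    from transformers import pipeline  # deferred: optional heavy dependency

    # aggregation_strategy="simple" merges word-piece tokens into entity spans.
    return pipeline("token-classification", model=model_id,
                    aggregation_strategy="simple")
```

Calling `load_ner("<this-repo-id>")` then returns a pipeline whose output is a list of detected entity spans with their fine-grained labels and scores.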

Files changed (1): README.md (+8 −4)
````diff
@@ -1,16 +1,17 @@
 ---
-license: mit
+base_model:
+- FacebookAI/xlm-roberta-large
 datasets:
 - MultiCoNER/multiconer_v2
 language:
 - zh
+license: mit
 metrics:
 - f1
 - precision
 - recall
-base_model:
-- FacebookAI/xlm-roberta-large
 pipeline_tag: token-classification
+library_name: transformers
 tags:
 - NER
 - Named_Entity_Recognition
@@ -19,6 +20,8 @@ pretty_name: MultiCoNER2 Chinese XLM-RoBERTa
 
 **XLM-RoBERTa is fine-tuned on Chinese [MultiCoNER2](https://huggingface.co/datasets/MultiCoNER/multiconer_v2) dataset for Fine-grained Named Entity Recognition.**
 
+This model is an expert detector part of the **AWED-FiNER** project, as described in the paper: [AWED-FiNER: Agents, Web applications, and Expert Detectors for Fine-grained Named Entity Recognition across 36 Languages for 6.6 Billion Speakers](https://huggingface.co/papers/2601.10161).
+
 The tagset of [MultiCoNER2](https://huggingface.co/datasets/MultiCoNER/multiconer_v2) is a fine-grained tagset. The fine to coarse level mapping of the tags are as follows:
 
 * Location (LOC) : Facility, OtherLOC, HumanSettlement, Station
@@ -93,4 +96,5 @@ If you use this model, please cite the following papers:
   booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
   volume={40},
   year={2026}
-}
+}
+```
````
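The README excerpt above shows the fine-to-coarse tag mapping only for the Location group (the other groups are elided in this diff). A minimal sketch of how such a mapping collapses fine-grained BIO tags to coarse classes, covering only the LOC entries visible here:

```python
# Fine-to-coarse mapping for the MultiCoNER2 tagset. Only the Location
# group appears in the diff above, so this sketch covers LOC alone.
FINE_TO_COARSE = {
    "Facility": "LOC",
    "OtherLOC": "LOC",
    "HumanSettlement": "LOC",
    "Station": "LOC",
}


def coarse_tag(bio_tag: str) -> str:
    """Collapse a fine-grained BIO tag (e.g. 'B-Station') to its coarse class."""
    if bio_tag == "O":
        return "O"
    prefix, _, fine = bio_tag.partition("-")  # split 'B-Station' into 'B', 'Station'
    # Fall back to the fine label unchanged if it is not in the (partial) map.
    return f"{prefix}-{FINE_TO_COARSE.get(fine, fine)}"
```

For example, `coarse_tag("B-Station")` yields `"B-LOC"`, which is how fine-grained predictions can be evaluated against a coarse-grained reference tagset.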