Add library_name and improve documentation

#1
by nielsr HF Staff - opened
Files changed (1)
  1. README.md +9 -4
README.md CHANGED
@@ -1,24 +1,27 @@
 ---
-license: mit
+base_model:
+- FacebookAI/xlm-roberta-large
 datasets:
 - MultiCoNER/multiconer_v2
 language:
 - it
+license: mit
 metrics:
 - f1
 - precision
 - recall
-base_model:
-- FacebookAI/xlm-roberta-large
 pipeline_tag: token-classification
 tags:
 - NER
 - Named_Entity_Recognition
 pretty_name: MultiCoNER2 Italian XLM-RoBERTa
+library_name: transformers
 ---
 
 **XLM-RoBERTa is fine-tuned on Italian [MultiCoNER2](https://huggingface.co/datasets/MultiCoNER/multiconer_v2) dataset for Fine-grained Named Entity Recognition.**
 
+This model is an expert detector part of the **AWED-FiNER** collection, as presented in the paper [AWED-FiNER: Agents, Web applications, and Expert Detectors for Fine-grained Named Entity Recognition across 36 Languages for 6.6 Billion Speakers](https://huggingface.co/papers/2601.10161).
+
 The tagset of [MultiCoNER2](https://huggingface.co/datasets/MultiCoNER/multiconer_v2) is a fine-grained tagset. The fine to coarse level mapping of the tags are as follows:
 
 * Location (LOC) : Facility, OtherLOC, HumanSettlement, Station
@@ -71,6 +74,7 @@ If you use this model, please cite the following papers:
 ```bibtex
 @inproceedings{fetahu2023multiconer,
 title={MultiCoNER v2: a Large Multilingual dataset for Fine-grained and Noisy Named Entity Recognition},
+author={Fetahu, Besnik}
 author={Fetahu, Besnik and Chen, Zhiyu and Kar, Sudipta and Rokhlenko, Oleg and Malmasi, Shervin},
 booktitle={Findings of the Association for Computational Linguistics: EMNLP 2023},
 pages={2027--2051},
@@ -93,4 +97,5 @@ If you use this model, please cite the following papers:
 booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
 volume={40},
 year={2026}
-}
+}
+```
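
The README's tagset section describes a fine-to-coarse mapping of MultiCoNER2 labels; only the LOC line (Facility, OtherLOC, HumanSettlement, Station) is visible in this excerpt. A minimal post-processing sketch of that mapping, where the `coarsen` helper and `FINE_TO_COARSE` dict are hypothetical names, not part of the model or the PR:

```python
# Collapse fine-grained MultiCoNER2 BIO tags to their coarse group.
# Only the LOC group appears in this diff excerpt; the other groups
# (e.g. PER, GRP) would be added to the dict the same way.
FINE_TO_COARSE = {
    "Facility": "LOC",
    "OtherLOC": "LOC",
    "HumanSettlement": "LOC",
    "Station": "LOC",
}

def coarsen(tag: str) -> str:
    """Map a BIO tag such as 'B-Facility' to its coarse form 'B-LOC'."""
    if tag == "O":  # the outside tag has no fine-grained class
        return tag
    prefix, _, fine = tag.partition("-")  # e.g. ("B", "-", "Facility")
    return f"{prefix}-{FINE_TO_COARSE.get(fine, fine)}"

print(coarsen("B-Facility"))  # -> B-LOC
print(coarsen("I-Station"))   # -> I-LOC
print(coarsen("O"))           # -> O
```

Unknown fine labels pass through unchanged, so the helper degrades gracefully if the mapping table is incomplete.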