nielsr (HF Staff) committed
Commit 5cded2e · verified · 1 Parent(s): 4c2102a

Add library_name and update model card

Hi! I'm Niels from the Hugging Face community team.

I noticed this model is part of the AWED-FiNER project but was missing the `library_name` metadata. I've added `library_name: transformers` to the YAML header, which will enable the "Use in Transformers" button and automated code snippets on the model page.

I've also updated the description to more clearly link the model to the AWED-FiNER paper and collection.
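With `library_name: transformers` set, the model page can surface a loading snippet along these lines. This is a minimal sketch: the repo id below is a placeholder, since the model's actual id is not shown in this diff, and the example Bengali sentence is illustrative only.

```python
from transformers import pipeline

# Placeholder repo id -- substitute the actual model id from the model page.
model_id = "your-org/multiconer2-bn-xlm-roberta-large"

# "token-classification" matches the card's pipeline_tag; aggregation_strategy
# merges word-piece tokens back into whole entity spans.
ner = pipeline(
    "token-classification",
    model=model_id,
    aggregation_strategy="simple",
)

# Example Bengali input ("Dhaka is the capital of Bangladesh.").
print(ner("ঢাকা বাংলাদেশের রাজধানী।"))
```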

Files changed (1):

1. README.md (+8 -4)

````diff
@@ -1,16 +1,17 @@
 ---
-license: mit
+base_model:
+- FacebookAI/xlm-roberta-large
 datasets:
 - MultiCoNER/multiconer_v2
 language:
 - bn
+license: mit
 metrics:
 - f1
 - precision
 - recall
-base_model:
-- FacebookAI/xlm-roberta-large
 pipeline_tag: token-classification
+library_name: transformers
 tags:
 - NER
 - Named_Entity_Recognition
@@ -19,6 +20,8 @@ pretty_name: MultiCoNER2 Bengali XLM-RoBERTa
 
 **XLM-RoBERTa is fine-tuned on Bengali [MultiCoNER2](https://huggingface.co/datasets/MultiCoNER/multiconer_v2) dataset for Fine-grained Named Entity Recognition.**
 
+This model is part of the **AWED-FiNER** project, as described in the paper: [AWED-FiNER: Agents, Web applications, and Expert Detectors for Fine-grained Named Entity Recognition across 36 Languages for 6.6 Billion Speakers](https://huggingface.co/papers/2601.10161).
+
 The tagset of [MultiCoNER2](https://huggingface.co/datasets/MultiCoNER/multiconer_v2) is a fine-grained tagset. The fine to coarse level mapping of the tags are as follows:
 
 * Location (LOC) : Facility, OtherLOC, HumanSettlement, Station
@@ -93,4 +96,5 @@ If you use this model, please cite the following papers:
 booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
 volume={40},
 year={2026}
-}
+}
+```
````