nielsr (HF Staff) committed · Commit 4720d51 · verified · 1 Parent(s): 32a81f3

Add library_name and improve metadata


Hi! I'm Niels from the Hugging Face community science team. I noticed that this model is part of the AWED-FiNER project and is compatible with the `transformers` library.

I've opened this PR to:
- Add `library_name: transformers` to the metadata to enable the "Use in Transformers" button and automated code snippets.
- Maintain the `token-classification` pipeline tag for better discoverability.
- Ensure the paper and GitHub repository are clearly linked in the model card.

Thanks for your contribution to the community!
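For context, `library_name: transformers` is what lets the Hub render a ready-made loading snippet for this model. A minimal sketch of such a snippet, assuming a hypothetical repo id placeholder (substitute this model's actual id on the Hub); the import is done lazily so the sketch has no hard dependency until it is actually called:

```python
# Hypothetical repo id for illustration only -- replace with this model's actual Hub id.
MODEL_ID = "prachuryyaIITG/<model-id>"


def load_ner_pipeline(model_id: str = MODEL_ID):
    """Build a token-classification pipeline; weights are downloaded on first use."""
    from transformers import pipeline  # lazy import: only needed when the pipeline is built

    return pipeline("token-classification", model=model_id, aggregation_strategy="simple")


# Usage (requires network access to the Hub):
#   ner = load_ner_pipeline()
#   ner("Acetaminophen reduced the fever.")
```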

Files changed (1): README.md (+8 −5)
README.md CHANGED

````diff
@@ -1,16 +1,17 @@
 ---
-license: mit
+base_model:
+- FacebookAI/xlm-roberta-large
 datasets:
 - MultiCoNER/multiconer_v2
 language:
 - en
+license: mit
 metrics:
 - f1
 - precision
 - recall
-base_model:
-- FacebookAI/xlm-roberta-large
 pipeline_tag: token-classification
+library_name: transformers
 tags:
 - NER
 - Named_Entity_Recognition
@@ -19,6 +20,8 @@ pretty_name: MultiCoNER2 English XLM-RoBERTa
 
 **XLM-RoBERTa is fine-tuned on English [MultiCoNER2](https://huggingface.co/datasets/MultiCoNER/multiconer_v2) dataset for Fine-grained Named Entity Recognition.**
 
+This model is part of the **AWED-FiNER** project, presented in the paper [AWED-FiNER: Agents, Web applications, and Expert Detectors for Fine-grained Named Entity Recognition across 36 Languages for 6.6 Billion Speakers](https://huggingface.co/papers/2601.10161).
+
 The tagset of [MultiCoNER2](https://huggingface.co/datasets/MultiCoNER/multiconer_v2) is a fine-grained tagset. The fine to coarse level mapping of the tags are as follows:
 
 * Location (LOC) : Facility, OtherLOC, HumanSettlement, Station
@@ -28,6 +31,7 @@ The tagset of [MultiCoNER2](https://huggingface.co/datasets/MultiCoNER/multicone
 * Product (PROD) : Clothing, Vehicle, Food, Drink, OtherPROD
 * Medical (MED) : Medication/Vaccine, MedicalProcedure, AnatomicalStructure, Symptom, Disease
 
+[**AWED-FiNER collection**](https://huggingface.co/collections/prachuryyaIITG/awed-finer) | [**Paper**](https://huggingface.co/papers/2601.10161) | [**Agentic Tool**](https://github.com/PrachuryyaKaushik/AWED-FiNER) | [**Interactive Demo**](https://huggingface.co/spaces/prachuryyaIITG/AWED-FiNER)
 
 ## Model performance:
 Precision: 78.29 <br>
@@ -41,8 +45,6 @@ Learning Rate: 5e-5 <br>
 Weight Decay: 0.01 <br>
 Batch Size: 64 <br>
 
-[**AWED-FiNER collection**](https://huggingface.co/collections/prachuryyaIITG/awed-finer) | [**Paper**](https://huggingface.co/papers/2601.10161) | [**Agentic Tool**](https://github.com/PrachuryyaKaushik/AWED-FiNER) | [**Interactive Demo**](https://huggingface.co/spaces/prachuryyaIITG/AWED-FiNER)
-
 ## Sample Usage of Agentic Tool
 
 The AWED-FiNER agentic tool can be used to interact with expert models trained using this framework. Below is an example:
@@ -94,3 +96,4 @@ If you use this model, please cite the following papers:
  volume={40},
  year={2026}
 }
+```
````
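The fine-to-coarse tag mapping in the model card can be expressed as a small lookup table, which is handy when collapsing fine-grained predictions into their coarse groups. A minimal sketch covering only the groups visible in this diff (`coarse_label` is an illustrative helper, not part of the repository):

```python
# Fine-to-coarse mapping for the MultiCoNER2 tag groups shown in this card.
FINE_TO_COARSE = {
    **dict.fromkeys(["Facility", "OtherLOC", "HumanSettlement", "Station"], "LOC"),
    **dict.fromkeys(["Clothing", "Vehicle", "Food", "Drink", "OtherPROD"], "PROD"),
    **dict.fromkeys(
        ["Medication/Vaccine", "MedicalProcedure", "AnatomicalStructure", "Symptom", "Disease"],
        "MED",
    ),
}


def coarse_label(tag: str) -> str:
    """Collapse a fine-grained tag (optionally with a B-/I- prefix) to its coarse group."""
    prefix = ""
    if tag[:2] in ("B-", "I-"):
        prefix, tag = tag[:2], tag[2:]
    return prefix + FINE_TO_COARSE[tag]


# e.g. coarse_label("B-Station") -> "B-LOC"
```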