nielsr (HF Staff) committed · Commit 8856079 · verified · 1 parent: d40b8b3

Add library_name metadata and GitHub link


This PR adds `library_name: transformers` to the YAML metadata to enable automated code snippets and the "Use in Transformers" button on the model page. It also improves the model description by explicitly linking it to the **AWED-FiNER** project and its corresponding research paper.
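The Hub reads `library_name` from the YAML front matter at the top of `README.md` to decide which loading snippet to show. As a rough illustration of what that front-matter lookup involves, here is a minimal stdlib sketch that extracts top-level `key: value` pairs from a model card's front matter. This is an assumption-laden toy, not how the Hub actually parses metadata (the Hub uses a full YAML parser); it handles only simple scalar lines:

```python
import re

def front_matter_fields(readme_text: str) -> dict:
    """Extract top-level scalar keys from the YAML front matter block
    delimited by '---' lines (simple 'key: value' cases only)."""
    m = re.match(r"^---\n(.*?)\n---", readme_text, re.DOTALL)
    if not m:
        return {}
    fields = {}
    for line in m.group(1).splitlines():
        kv = re.match(r"^([A-Za-z_]+):\s*(\S.*)?$", line)
        if kv and kv.group(2):
            fields[kv.group(1)] = kv.group(2).strip()
    return fields

readme = """---
license: mit
library_name: transformers
pipeline_tag: token-classification
---
# Model card
"""
assert front_matter_fields(readme)["library_name"] == "transformers"
```

With `library_name: transformers` present, the model page can surface the "Use in Transformers" button and an automated loading snippet.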

Files changed (1): README.md (+8 −4)
README.md CHANGED

@@ -1,16 +1,17 @@
 ---
-license: mit
+base_model:
+- FacebookAI/xlm-roberta-large
 datasets:
 - MultiCoNER/multiconer_v2
 language:
 - sv
+license: mit
 metrics:
 - f1
 - precision
 - recall
-base_model:
-- FacebookAI/xlm-roberta-large
 pipeline_tag: token-classification
+library_name: transformers
 tags:
 - NER
 - Named_Entity_Recognition
@@ -19,6 +20,8 @@ pretty_name: MultiCoNER2 Swedish XLM-RoBERTa
 
 **XLM-RoBERTa is fine-tuned on Swedish [MultiCoNER2](https://huggingface.co/datasets/MultiCoNER/multiconer_v2) dataset for Fine-grained Named Entity Recognition.**
 
+This model is part of the **AWED-FiNER** project, presented in the paper: [AWED-FiNER: Agents, Web applications, and Expert Detectors for Fine-grained Named Entity Recognition across 36 Languages for 6.6 Billion Speakers](https://huggingface.co/papers/2601.10161).
+
 The tagset of [MultiCoNER2](https://huggingface.co/datasets/MultiCoNER/multiconer_v2) is a fine-grained tagset. The fine to coarse level mapping of the tags are as follows:
 
 * Location (LOC) : Facility, OtherLOC, HumanSettlement, Station
@@ -41,7 +44,7 @@ Learning Rate: 5e-5 <br>
 Weight Decay: 0.01 <br>
 Batch Size: 64 <br>
 
-[**AWED-FiNER collection**](https://huggingface.co/collections/prachuryyaIITG/awed-finer) | [**Paper**](https://huggingface.co/papers/2601.10161) | [**Agentic Tool**](https://github.com/PrachuryyaKaushik/AWED-FiNER) | [**Interactive Demo**](https://huggingface.co/spaces/prachuryyaIITG/AWED-FiNER)
+[**AWED-FiNER collection**](https://huggingface.co/collections/prachuryyaIITG/awed-finer) | [**Paper**](https://huggingface.co/papers/2601.10161) | [**GitHub Repository**](https://github.com/PrachuryyaKaushik/AWED-FiNER) | [**Interactive Demo**](https://huggingface.co/spaces/prachuryyaIITG/AWED-FiNER)
 
 ## Sample Usage of Agentic Tool
 
@@ -94,3 +97,4 @@ If you use this model, please cite the following papers:
   volume={40},
   year={2026}
 }
+```