prachuryyaIITG and nielsr (HF Staff) committed
Commit a299453 · 1 parent: 5fa6883

Improve model card: add library_name, update paper link and GitHub link (#1)

- Improve model card: add library_name, update paper link and GitHub link (146a124f11a97f3aac980ae18e296a5ce8240a11)

Co-authored-by: Niels Rogge <nielsr@users.noreply.huggingface.co>

Files changed (1):
  README.md (+22 -19)
README.md CHANGED
@@ -1,28 +1,30 @@
  ---
- license: mit
- language:
- - ne
  base_model:
  - google/muril-large-cased
- pipeline_tag: token-classification
- tags:
- - NER
- - Named_Entity_Recognition
- pretty_name: SampurNER Nepali MuRIL
  datasets:
  - prachuryyaIITG/SampurNER
  metrics:
  - f1
  - precision
  - recall
  ---

  **MuRIL is fine-tuned on the Nepali [SampurNER](https://huggingface.co/datasets/prachuryyaIITG/SampurNER) dataset for Fine-grained Named Entity Recognition. It was created using the [EaMaTa](https://github.com/PrachuryyaKaushik/SampurNER/blob/main/SampurNER_AAAI_extended.pdf) framework, utilizing the [Few-NERD](https://huggingface.co/datasets/DFKI-SLT/few-nerd) dataset.** <br>

- Read the paper: [SampurNER in AAAI-2026](https://github.com/PrachuryyaKaushik/SampurNER/blob/main/SampurNER_AAAI_extended.pdf)
-
- SampurNER Dataset: [datasets/prachuryyaIITG/SampurNER](https://huggingface.co/datasets/prachuryyaIITG/SampurNER)

  [Few-NERD](https://aclanthology.org/2021.acl-long.248.pdf) uses a fine-grained tagset. The fine-to-coarse mapping of the tags is as follows:

@@ -80,14 +82,6 @@ print(result)
  If you use this model, please cite the following papers:

  ```bibtex
- @inproceedings{kaushik2026sampurner,
- title={SampurNER: Fine-grained Named Entity Recognition Dataset for 22 Indian Languages},
- author={Kaushik, Prachuryya and Anand, Ashish},
- booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
- volume={40},
- year={2026}
- }
-
  @misc{kaushik2026awedfineragentswebapplications,
  title={AWED-FiNER: Agents, Web applications, and Expert Detectors for Fine-grained Named Entity Recognition across 36 Languages for 6.6 Billion Speakers},
  author={Prachuryya Kaushik and Ashish Anand},
@@ -98,6 +92,14 @@ If you use this model, please cite the following papers:
  url={https://arxiv.org/abs/2601.10161},
  }

  @inproceedings{ding-etal-2021-nerd,
  title = "Few-{NERD}: A Few-shot Named Entity Recognition Dataset",
  author = "Ding, Ning and Xu, Guangwei and Chen, Yulin and Wang, Xiaobin and Han, Xu and Xie, Pengjun and Zheng, Haitao and Liu, Zhiyuan",
@@ -109,3 +111,4 @@ If you use this model, please cite the following papers:
  doi = "10.18653/v1/2021.acl-long.248",
  pages = "3198--3213",
  }

README.md (new version):

  ---
  base_model:
  - google/muril-large-cased
  datasets:
  - prachuryyaIITG/SampurNER
+ language:
+ - ne
+ license: mit
  metrics:
  - f1
  - precision
  - recall
+ pipeline_tag: token-classification
+ tags:
+ - NER
+ - Named_Entity_Recognition
+ pretty_name: SampurNER Nepali MuRIL
+ library_name: transformers
  ---

  **MuRIL is fine-tuned on the Nepali [SampurNER](https://huggingface.co/datasets/prachuryyaIITG/SampurNER) dataset for Fine-grained Named Entity Recognition. It was created using the [EaMaTa](https://github.com/PrachuryyaKaushik/SampurNER/blob/main/SampurNER_AAAI_extended.pdf) framework, utilizing the [Few-NERD](https://huggingface.co/datasets/DFKI-SLT/few-nerd) dataset.** <br>

+ This model is part of the work presented in the paper [AWED-FiNER: Agents, Web applications, and Expert Detectors for Fine-grained Named Entity Recognition across 36 Languages for 6.6 Billion Speakers](https://huggingface.co/papers/2601.10161).

+ - **Code:** [https://github.com/PrachuryyaKaushik/AWED-FiNER](https://github.com/PrachuryyaKaushik/AWED-FiNER)
+ - **Interactive Demo:** [https://huggingface.co/spaces/prachuryyaIITG/AWED-FiNER](https://huggingface.co/spaces/prachuryyaIITG/AWED-FiNER)
+ - **Dataset:** [prachuryyaIITG/SampurNER](https://huggingface.co/datasets/prachuryyaIITG/SampurNER)

  [Few-NERD](https://aclanthology.org/2021.acl-long.248.pdf) uses a fine-grained tagset. The fine-to-coarse mapping of the tags is as follows:

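Few-NERD fine-grained entity types are conventionally named with the coarse type as a hyphenated prefix (e.g. `person-actor` belongs to the coarse type `person`). A minimal sketch of recovering coarse tags from fine-grained model predictions, assuming that naming convention (the tag strings below are illustrative examples, not output from this model):

```python
def coarse_type(fine_tag: str) -> str:
    """Map a Few-NERD fine-grained entity type to its coarse type.

    Assumes the "coarse-fine" naming convention, e.g. "person-actor" -> "person".
    """
    return fine_tag.split("-", 1)[0]

# Hypothetical fine-grained predictions for one sentence:
fine_tags = ["person-actor", "location-GPE", "building-hospital"]
print([coarse_type(t) for t in fine_tags])  # ['person', 'location', 'building']
```

Splitting on the first hyphen only (`maxsplit=1`) keeps fine types that themselves contain hyphens intact on the right-hand side.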
 
  If you use this model, please cite the following papers:

  ```bibtex
  @misc{kaushik2026awedfineragentswebapplications,
  title={AWED-FiNER: Agents, Web applications, and Expert Detectors for Fine-grained Named Entity Recognition across 36 Languages for 6.6 Billion Speakers},
  author={Prachuryya Kaushik and Ashish Anand},

  url={https://arxiv.org/abs/2601.10161},
  }

+ @inproceedings{kaushik2026sampurner,
+ title={SampurNER: Fine-grained Named Entity Recognition Dataset for 22 Indian Languages},
+ author={Kaushik, Prachuryya and Anand, Ashish},
+ booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
+ volume={40},
+ year={2026}
+ }
+
  @inproceedings{ding-etal-2021-nerd,
  title = "Few-{NERD}: A Few-shot Named Entity Recognition Dataset",
  author = "Ding, Ning and Xu, Guangwei and Chen, Yulin and Wang, Xiaobin and Han, Xu and Xie, Pengjun and Zheng, Haitao and Liu, Zhiyuan",

  doi = "10.18653/v1/2021.acl-long.248",
  pages = "3198--3213",
  }
+ ```