nielsr (HF Staff) committed 10e3b2c · verified · 1 Parent(s): 1eed622

Fix metadata inconsistencies, add library_name and paper/code links


Hi! I'm Niels from the Hugging Face community science team. I'm opening this PR to improve the documentation of your model.

Based on the `config.json` file in this repository, it appears this model is based on `xlm-roberta-large` rather than MuRIL, and the language code `as` indicates it is for Assamese. I have updated the metadata to reflect this.

Additionally, I've:
- Added `library_name: transformers` to the metadata to enable the "Use in Transformers" button and automated code snippets.
- Linked the model to its official paper page on Hugging Face: [AWED-FiNER: Agents, Web applications, and Expert Detectors for Fine-grained Named Entity Recognition across 36 Languages for 6.6 Billion Speakers](https://huggingface.co/papers/2601.10161).
- Added a link to the GitHub repository.
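As a quick sanity check of the metadata changes listed above, here is a stdlib-only sketch (illustrative, not part of the PR) that pulls the top-level keys out of the updated YAML front matter and confirms the newly added `library_name` and the corrected `base_model` are present. The line-based parsing is deliberately simplistic and only handles the flat `key: value` / `- item` shape that model-card metadata uses:

```python
# The front matter below is copied from the updated README.md in this PR
# (tags/metrics elided for brevity).
front_matter = """\
---
base_model:
- FacebookAI/xlm-roberta-large
datasets:
- prachuryyaIITG/APTFiNER
language:
- as
license: mit
pipeline_tag: token-classification
library_name: transformers
---
"""

def top_level_keys(text: str) -> list[str]:
    """Collect top-level YAML keys, skipping '---' fences, list items,
    indented lines, and blanks. Not a real YAML parser."""
    keys = []
    for line in text.splitlines():
        if line.startswith(("---", "-", " ")) or not line.strip():
            continue
        keys.append(line.split(":", 1)[0])
    return keys

keys = top_level_keys(front_matter)
print(keys)
# ['base_model', 'datasets', 'language', 'license', 'pipeline_tag', 'library_name']
```

For anything beyond flat model-card metadata, a real YAML parser (e.g. PyYAML's `yaml.safe_load`) is the right tool; this sketch just avoids a third-party dependency.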

These changes will help users discover and use your model more effectively!

Files changed (1)
  1. README.md +14 -8
README.md CHANGED

````diff
@@ -1,23 +1,29 @@
 ---
-license: mit
+base_model:
+- FacebookAI/xlm-roberta-large
+datasets:
+- prachuryyaIITG/APTFiNER
 language:
 - as
+license: mit
 metrics:
 - f1
 - precision
 - recall
-base_model:
-- google/muril-large-cased
 pipeline_tag: token-classification
 tags:
 - NER
 - Named_Entity_Recognition
-pretty_name: APTFiNER Tamil MuRIL
-datasets:
-- prachuryyaIITG/APTFiNER
+pretty_name: APTFiNER Assamese XLM-R
+library_name: transformers
 ---
 
-**MuRIL is fine-tuned on Assamese APTFiNER dataset for Fine-grained Named Entity Recognition.**
+**This model is fine-tuned on the Assamese APTFiNER dataset for Fine-grained Named Entity Recognition.**
+
+It is part of the **AWED-FiNER** collection, as presented in the paper [AWED-FiNER: Agents, Web applications, and Expert Detectors for Fine-grained Named Entity Recognition across 36 Languages for 6.6 Billion Speakers](https://huggingface.co/papers/2601.10161).
+
+- **Code:** [GitHub - AWED-FiNER](https://github.com/PrachuryyaKaushik/AWED-FiNER)
+- **Interactive Demo:** [Hugging Face Space](https://huggingface.co/spaces/prachuryyaIITG/AWED-FiNER)
 
 The tagset of [MultiCoNER2](https://huggingface.co/datasets/MultiCoNER/multiconer_v2) is a fine-grained tagset. The fine to coarse level mapping of the tags are as follows:
 
@@ -109,4 +115,4 @@ If you use this model, please cite the following papers:
 pages={2027--2051},
 year={2023}
 }
-```
+```
````