Add `library_name: transformers` to metadata
Hi there! I'm Niels from the Hugging Face community science team.
This PR adds the `library_name: transformers` metadata to your model card. This will enable the "Use in Transformers" button and generate automated code snippets for your model, making it more accessible to the community.
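For context, `library_name` lives in the YAML frontmatter at the top of README.md, between the `---` markers. A toy sketch of how such `key: value` metadata parses (plain Python for illustration only, not the Hub's actual parser; the card text below is abbreviated):

```python
CARD = """---
library_name: transformers
pipeline_tag: token-classification
---

**MuRIL is fine-tuned on Marathi CLASSER dataset...**
"""

def frontmatter(card: str) -> dict:
    """Parse simple `key: value` pairs from the card's YAML frontmatter."""
    # The frontmatter is the text between the first pair of `---` markers.
    block = card.split("---")[1]
    meta = {}
    for line in block.strip().splitlines():
        # Skip YAML list items (`- mr`); keep scalar key/value pairs.
        if ":" in line and not line.startswith("-"):
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()
    return meta

print(frontmatter(CARD)["library_name"])  # transformers
```

Once this key is present, the Hub knows which library's loading code to show on the model page.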
Everything else in your model card looks excellent!
README.md (CHANGED):

````diff
@@ -1,20 +1,21 @@
 ---
-license: mit
-language:
-- mr
 base_model:
 - google/muril-large-cased
-pipeline_tag: token-classification
-tags:
-- NER
-- Named_Entity_Recognition
-pretty_name: CLASSER Marathi MuRIL
 datasets:
 - prachuryyaIITG/CLASSER
+language:
+- mr
+license: mit
 metrics:
 - f1
 - precision
 - recall
+pipeline_tag: token-classification
+library_name: transformers
+tags:
+- NER
+- Named_Entity_Recognition
+pretty_name: CLASSER Marathi MuRIL
 ---
 
 **MuRIL is fine-tuned on Marathi [CLASSER](https://huggingface.co/datasets/prachuryyaIITG/CLASSER) dataset for Fine-grained Named Entity Recognition.**
@@ -110,4 +111,5 @@ If you use this model, please cite the following papers:
 booktitle={Findings of the Association for Computational Linguistics: EMNLP 2023},
 pages={2027--2051},
 year={2023}
-}
+}
+```
````