How to use l3cube-pune/marathi-ner-iob with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="l3cube-pune/marathi-ner-iob")

# Or load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("l3cube-pune/marathi-ner-iob")
model = AutoModelForTokenClassification.from_pretrained("l3cube-pune/marathi-ner-iob")
```
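Once loaded, the pipeline can be called on raw Marathi text. A minimal sketch follows; the example sentence is illustrative, and the exact entity labels returned depend on the model's IOB tag set:

```python
from transformers import pipeline

# Token-classification pipeline for the fine-tuned NER model
pipe = pipeline("token-classification", model="l3cube-pune/marathi-ner-iob")

# Illustrative sentence: "Pune is a city in Maharashtra"
results = pipe("पुणे हे महाराष्ट्रातील एक शहर आहे")

# Each result is a dict with the sub-token, its IOB label, and a confidence score
for r in results:
    print(r["word"], r["entity"], round(r["score"], 3))
```

By default the pipeline reports one entry per labeled sub-token; passing `aggregation_strategy="simple"` to `pipeline(...)` groups sub-tokens into whole entities instead.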
MahaNER-BERT-IOB
MahaNER-BERT-IOB is a MahaBERT (l3cube-pune/marathi-bert) model fine-tuned on L3Cube-MahaNER (with IOB-format tags), a Marathi named entity recognition dataset.
[dataset link](https://github.com/l3cube-pune/MarathiNLP)

More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2204.06029).
Non-IOB model: marathi-ner
```
@InProceedings{litake-EtAl:2022:WILDRE6,
  author    = {Litake, Onkar and Sabane, Maithili Ravindra and Patil, Parth Sachin and Ranade, Aparna Abhijeet and Joshi, Raviraj},
  title     = {L3Cube-MahaNER: A Marathi Named Entity Recognition Dataset and BERT models},
  booktitle = {Proceedings of The WILDRE-6 Workshop within the 13th Language Resources and Evaluation Conference},
  month     = {June},
  year      = {2022},
  address   = {Marseille, France},
  publisher = {European Language Resources Association},
  pages     = {29--34}
}
```
Other models from the family:
Marathi NER : marathi-ner
Marathi NER IOB : marathi-ner-iob
Marathi Social NER : marathi-social-ner
Marathi Social NER IOB : marathi-social-ner-iob
Marathi Mixed NER (MahaNER + MahaSocialNER) : marathi-mixed-ner
Marathi Mixed NER IOB (MahaNER IOB + MahaSocialNER IOB) : marathi-mixed-ner-iob