L3Cube-MahaCorpus and MahaBERT: Marathi Monolingual Corpus, Marathi BERT Language Models, and Resources
Paper: arXiv:2202.01159
A newer version of this model is available here: https://huggingface.co/l3cube-pune/marathi-bert-v2

How to use l3cube-pune/marathi-bert with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="l3cube-pune/marathi-bert")
```

```python
# Load the model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("l3cube-pune/marathi-bert")
model = AutoModelForMaskedLM.from_pretrained("l3cube-pune/marathi-bert")
```
MahaBERT is a Marathi BERT model. It is a multilingual BERT (bert-base-multilingual-cased) model fine-tuned on L3Cube-MahaCorpus and other publicly available Marathi monolingual datasets. [Dataset link](https://github.com/l3cube-pune/MarathiNLP)

More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2202.01159).
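Since MahaBERT is a masked language model, the simplest way to try it is the fill-mask pipeline. The sketch below is illustrative: the Marathi example sentence is an assumption of mine (not from the paper or model card), and the top predictions depend on the model weights, so no specific output is shown.

```python
# Fill-mask inference sketch for MahaBERT.
# Assumption: the example sentence is illustrative only; it roughly means
# "Pune is a [MASK] in Maharashtra."
from transformers import pipeline

pipe = pipeline("fill-mask", model="l3cube-pune/marathi-bert")

# BERT-style models use the [MASK] placeholder token.
predictions = pipe("पुणे हे महाराष्ट्रातील एक [MASK] आहे.")

# Each prediction is a dict with the filled token and its probability.
for p in predictions:
    print(f"{p['token_str']}\t{p['score']:.3f}")
```

By default the pipeline returns the top five candidate tokens for the mask position, ranked by score.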
```bibtex
@InProceedings{joshi:2022:WILDRE6,
  author    = {Joshi, Raviraj},
  title     = {L3Cube-MahaCorpus and MahaBERT: Marathi Monolingual Corpus, Marathi BERT Language Models, and Resources},
  booktitle = {Proceedings of The WILDRE-6 Workshop within the 13th Language Resources and Evaluation Conference},
  month     = {June},
  year      = {2022},
  address   = {Marseille, France},
  publisher = {European Language Resources Association},
  pages     = {97--101}
}
```