Instructions for using melll-uff/bertweetbr with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use melll-uff/bertweetbr with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="melll-uff/bertweetbr")

# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("melll-uff/bertweetbr")
model = AutoModelForMaskedLM.from_pretrained("melll-uff/bertweetbr")
```
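For reference, here is a minimal sketch of querying the model through the fill-mask pipeline. The example sentence is an assumption, and the `<mask>` placeholder assumes the model inherits BERTweet's RoBERTa-style mask token (check `pipe.tokenizer.mask_token` to confirm):

```python
# Minimal sketch: print the top predictions for a masked token.
# The Portuguese sentence below is a made-up example; <mask> assumes the
# model uses the RoBERTa-style mask token (verify with pipe.tokenizer.mask_token).
from transformers import pipeline

pipe = pipeline("fill-mask", model="melll-uff/bertweetbr")

for pred in pipe("Eu adoro assistir <mask> no fim de semana."):
    print(pred["token_str"], round(pred["score"], 3))
```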
- Notebooks
- Google Colab
- Kaggle
Citation
Discussion opened by finiteautomata
Hi! Great work training this model.
Is there any resource to cite in our research for this model?
Thanks!
Hi @finiteautomata, thanks for your message. Yes, we are working on a paper to publish our work. Hopefully, in a few weeks we will release it and update the citation section of our model's card.
Regards.
Fernando Carneiro.
Great! Thanks Fernando. Please let me know when it is ready so I can cite it.
Best,
Juan Manuel