Instructions to use klue/bert-base with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use klue/bert-base with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="klue/bert-base")

# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("klue/bert-base")
model = AutoModelForMaskedLM.from_pretrained("klue/bert-base")
```
- Inference
- Notebooks
- Google Colab
- Kaggle
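For readers curious what the fill-mask pipeline does after the forward pass, here is a minimal sketch of the ranking step: the model emits one logit per vocabulary token at the `[MASK]` position, and the pipeline applies a softmax and returns the top-scoring candidates. The toy vocabulary and logits below are invented for illustration; a real klue/bert-base vocabulary has roughly 32,000 entries.

```python
import math

def top_k_fill_mask(logits, vocab, k=3):
    """Softmax the mask-position logits and return the k most likely tokens.

    Illustrative sketch of the selection step inside a fill-mask
    pipeline; logits and vocab here are toy values, not model output.
    """
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]  # numerically stable softmax
    total = sum(exps)
    probs = [e / total for e in exps]
    ranked = sorted(zip(vocab, probs), key=lambda p: p[1], reverse=True)
    return ranked[:k]

# Hypothetical candidates for the masked position in a Korean sentence.
vocab = ["서울", "부산", "대전", "김치", "학교"]
logits = [5.1, 3.2, 2.7, 0.4, 1.0]
print(top_k_fill_mask(logits, vocab, k=2))
```

The real pipeline returns each candidate with its score and the fully substituted sequence, but the ranking logic is the same softmax-then-sort shown here.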
Build out the model card
#2
by Marissa - opened
LGTM! 👍
In addition to bert-base, our team also released klue/roberta-small, klue/roberta-base, and klue/roberta-large. If possible, could you also update the model cards for the RoBERTa series?
moon1ite changed pull request status to merged