Instructions for using klue/roberta-large with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
  - Transformers
How to use klue/roberta-large with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="klue/roberta-large")

# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("klue/roberta-large")
model = AutoModelForMaskedLM.from_pretrained("klue/roberta-large")
```

- Inference
- Notebooks
- Google Colab
- Kaggle
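The fill-mask pipeline above can be exercised end to end with a Korean sentence. A minimal sketch, assuming network access to download the checkpoint; the example sentence is illustrative, and the mask token is read from the tokenizer rather than hardcoded so the snippet does not depend on whether this checkpoint uses `[MASK]` or `<mask>`:

```python
from transformers import pipeline

# Builds the fill-mask pipeline; fetches the model weights on first run
pipe = pipeline("fill-mask", model="klue/roberta-large")

# Use the tokenizer's own mask token instead of hardcoding it
mask = pipe.tokenizer.mask_token

# Example sentence: "The capital of South Korea is ____."
results = pipe(f"대한민국의 수도는 {mask} 입니다.")

# Each candidate is a dict with 'sequence', 'score', 'token', 'token_str',
# sorted from most to least likely fill
for r in results:
    print(r["token_str"], round(r["score"], 3))
```

The pipeline returns the top candidates by default; pass `top_k` to change how many fills are returned.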
Commit History
- fix: update tokenizer 5193b95
- docs: inference api template 785ec56
- docs: add readme ec2ecc6
- fix: model_max_length to 512 e09ea17
- fix: model_max_length to 514 2c7612e

james.ryu committed on