Instructions to use klue/roberta-base with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use klue/roberta-base with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="klue/roberta-base")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("klue/roberta-base")
model = AutoModelForMaskedLM.from_pretrained("klue/roberta-base")
```

- Inference
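Once loaded, the pipeline can be called directly on a masked sentence. A minimal sketch of a fill-mask call — the Korean example sentence is illustrative, not from the model card, and the first run downloads the model weights:

```python
from transformers import pipeline

# fill-mask pipeline for klue/roberta-base (downloads weights on first run)
pipe = pipeline("fill-mask", model="klue/roberta-base")

# "The capital of South Korea is [MASK]." -- illustrative sentence
results = pipe("대한민국의 수도는 [MASK]이다.")

# Each candidate is a dict with the predicted token and its probability
for r in results:
    print(r["token_str"], round(r["score"], 4))
```

The pipeline returns a ranked list of candidate fills, each with `token_str`, `score`, and the completed `sequence`.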
- Notebooks
- Google Colab
- Kaggle
Commit History
fix: only use token_type id 0 for fast tokenizer 67dd433
feat: add fast tokenizer cb6e01e
docs: inference api template e9af171
docs: update readme 9e8a544
docs: add readme 2dbb973
fix: model_max_length to 512 baa355e
fix: model_max_length to 514 41b3d9b
james.ryu committed
fix: model_max_length to 514 436ec9c
james.ryu committed
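The last three commits move `model_max_length` between 514 and 512. In RoBERTa-style models the position-embedding table typically has 514 rows, but the first two positions are reserved by the padding-index offset, so the usable input length is 512 tokens. A minimal sketch of that accounting, assuming standard RoBERTa conventions rather than values read from this repo's config:

```python
# RoBERTa-style position accounting (illustrative values, not read from the repo)
max_position_embeddings = 514  # rows in the position-embedding table
padding_idx = 1                # position ids start after the padding index
model_max_length = max_position_embeddings - padding_idx - 1

print(model_max_length)  # 512
```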