How to use mykor/roberta-base-ko with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="mykor/roberta-base-ko")

# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("mykor/roberta-base-ko")
model = AutoModelForMaskedLM.from_pretrained("mykor/roberta-base-ko")
```
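As a usage sketch, the pipeline can fill in a masked Korean sentence. The `<mask>` token is assumed from the standard RoBERTa tokenizer convention, and the example sentence is illustrative:

```python
from transformers import pipeline

# Usage sketch: fill-mask on a Korean sentence ("The capital of South Korea
# is <mask>."). "<mask>" is assumed to be the model's mask token, as is
# standard for RoBERTa tokenizers.
pipe = pipeline("fill-mask", model="mykor/roberta-base-ko")
predictions = pipe("대한민국의 수도는 <mask>이다.")

# Each prediction carries the filled-in token and its score.
for p in predictions:
    print(p["token_str"], round(p["score"], 4))
```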
roberta-base-ko
A RoBERTa model trained from scratch. It has almost the same architecture as roberta-base, but differs in a few configuration settings:
- `hidden_act`: `mish`
- `position_embedding_type`: `relative_key_query`
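These two overrides can be expressed with a `RobertaConfig`; a minimal sketch, assuming all remaining parameters match the roberta-base defaults:

```python
from transformers import RobertaConfig

# Sketch of a config with the two stated differences from roberta-base;
# every other parameter is left at its default value.
config = RobertaConfig(
    hidden_act="mish",                             # default is "gelu"
    position_embedding_type="relative_key_query",  # default is "absolute"
)

print(config.hidden_act)               # mish
print(config.position_embedding_type)  # relative_key_query
```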