KoBERT
How to use
To load the KoBERT tokenizer with
AutoTokenizer, you must pass trust_remote_code=True.
from transformers import AutoModel, AutoTokenizer
model = AutoModel.from_pretrained("monologg/kobert")
tokenizer = AutoTokenizer.from_pretrained("monologg/kobert", trust_remote_code=True)
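Once the model and tokenizer are loaded as above, they can be used to extract contextual embeddings. A minimal sketch (the Korean example sentence is illustrative, not from the card):

```python
from transformers import AutoModel, AutoTokenizer

# trust_remote_code=True is required for the custom KoBERT tokenizer code
tokenizer = AutoTokenizer.from_pretrained("monologg/kobert", trust_remote_code=True)
model = AutoModel.from_pretrained("monologg/kobert")

# Encode an example Korean sentence and run a forward pass
inputs = tokenizer("한국어 모델을 공유합니다.", return_tensors="pt")
outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```

The `last_hidden_state` tensor holds one contextual vector per input token, which can be pooled or fed into a downstream head for classification or tagging.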