Instructions to use klue/bert-base with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use klue/bert-base with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="klue/bert-base")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("klue/bert-base")
model = AutoModelForMaskedLM.from_pretrained("klue/bert-base")
```
- Inference
- Notebooks
- Google Colab
- Kaggle
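The fill-mask pipeline shown above can be called directly on masked text. A minimal sketch, assuming the model's `[MASK]` token and an illustrative Korean sentence of our choosing (the sentence and variable names are not from the model card):

```python
from transformers import pipeline

pipe = pipeline("fill-mask", model="klue/bert-base")

# klue/bert-base uses [MASK] as its mask token; the sentence is illustrative.
results = pipe("대한민국의 수도는 [MASK] 이다.")
for r in results:
    # Each candidate carries the predicted token, its score, and the filled sequence.
    print(r["token_str"], round(r["score"], 3))
```

Each element of `results` is a dict with `token_str`, `score`, and `sequence` keys, sorted by score.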
Add TF weights
Model converted with the transformers `pt_to_tf` CLI -- all converted model outputs and hidden layers were validated against their PyTorch counterparts. Maximum crossload output difference: 4.059e-05; maximum converted output difference: 4.059e-05.
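The kind of check the CLI performs can be approximated by crossloading the PyTorch weights into the TensorFlow architecture and comparing output logits. This is a sketch of that idea, not the actual `pt_to_tf` implementation, and it assumes both `torch` and `tensorflow` are installed; the test sentence is illustrative:

```python
# Illustrative crossload check (not the actual pt_to_tf code): load the same
# PyTorch checkpoint into both frameworks and compare the output logits.
import numpy as np
import torch
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    TFAutoModelForMaskedLM,
)

tokenizer = AutoTokenizer.from_pretrained("klue/bert-base")
pt_model = AutoModelForMaskedLM.from_pretrained("klue/bert-base")
# from_pt=True crossloads the PyTorch checkpoint into the TF model class.
tf_model = TFAutoModelForMaskedLM.from_pretrained("klue/bert-base", from_pt=True)

text = "대한민국의 수도는 [MASK] 이다."  # illustrative input
pt_inputs = tokenizer(text, return_tensors="pt")
tf_inputs = tokenizer(text, return_tensors="tf")

with torch.no_grad():
    pt_logits = pt_model(**pt_inputs).logits.numpy()
tf_logits = tf_model(**tf_inputs).logits.numpy()

# The maximum absolute difference should be tiny (the PR reports ~4e-05).
max_diff = float(np.max(np.abs(pt_logits - tf_logits)))
print(max_diff)
```

A small but nonzero difference is expected from framework-level floating-point differences, which is why the PR reports a maximum difference rather than exact equality.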
Hi there 👋
I'm a TF maintainer at Hugging Face, and this is your most-downloaded model whose weights can be automatically converted to TensorFlow using our tools. We believe having TF weights would be of interest to the community and would further boost the model's visibility.
I also don't want to be a source of spam! Let me know if you are interested in merging these TF weights, and whether you would like me to open PRs with TF weights for other models you own. Alternatively, if you'd like the TF weights but no Hub notifications, I can push the weights using admin privileges instead 🤗