Instructions to use google-bert/bert-base-chinese with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
  - Transformers
How to use google-bert/bert-base-chinese with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="google-bert/bert-base-chinese")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("google-bert/bert-base-chinese")
model = AutoModelForMaskedLM.from_pretrained("google-bert/bert-base-chinese")
```

- Inference
- Notebooks
  - Google Colab
  - Kaggle
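As a minimal sketch of what a fill-mask call looks like in practice (assuming `transformers` and a backend such as `torch` are installed, and the model can be downloaded), the example sentence below is illustrative, not from the model card. Note that `bert-base-chinese` tokenizes Chinese at the character level, so `[MASK]` stands for a single character:

```python
from transformers import pipeline

# Build a fill-mask pipeline; this downloads the model on first use
pipe = pipeline("fill-mask", model="google-bert/bert-base-chinese")

# Mask one character in an illustrative Chinese sentence
predictions = pipe("巴黎是法国的首[MASK]。")

# Each candidate carries the filled token and a confidence score
for p in predictions:
    print(p["token_str"], round(p["score"], 4))
```

The pipeline returns the top candidates sorted by score; pass `top_k` to change how many are returned.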
Commit History
upload flax model 4b1f5fb
allow flax 4f3e683
Migrate model card from transformers-repo a58e1a5
For clarity, delete deprecated modelcard.json 8585f10
Add tokenizer configuration 112e219
Thomas Wolf committed on