How to use google-bert/bert-base-chinese with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="google-bert/bert-base-chinese")
```
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("google-bert/bert-base-chinese")
model = AutoModelForMaskedLM.from_pretrained("google-bert/bert-base-chinese")
```
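To see the fill-mask pipeline in action, a minimal sketch (the example sentence is illustrative and not from the model card; it assumes `transformers` and a backend such as PyTorch are installed):

```python
from transformers import pipeline

pipe = pipeline("fill-mask", model="google-bert/bert-base-chinese")

# Replace the [MASK] token in a Chinese sentence; the pipeline
# returns the top candidate tokens with their scores.
results = pipe("巴黎是法国的[MASK]都。")
for r in results:
    print(r["token_str"], round(r["score"], 4))
```

Each result dict also carries the full filled-in `sequence`, which can be handy when displaying predictions in context.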
Hi! This PR adds some additional information to the model card, based on the format we are using as part of our effort to standardise model cards at Hugging Face. Feel free to merge if you are OK with the changes! (cc @Marissa @Meg)
Nice to standardize!