How to use OpenMOSS-Team/bart-base-chinese with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("OpenMOSS-Team/bart-base-chinese")
model = AutoModelForSeq2SeqLM.from_pretrained("OpenMOSS-Team/bart-base-chinese")
```
After I entered "北京是[MASK]的首都" here, I got the error: Can't load tokenizer using from_pretrained, please update its configuration: expected str, bytes or os.PathLike object, not NoneType
Hello, the Hosted Inference API is a built-in Hugging Face feature. Because BART-Chinese uses a different tokenizer from the original BART, that API cannot be used with this model; please refer to the README for instructions on downloading and using it.
Thanks a lot!