T5 Chinese Dialogue Model
Quick Start
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the tokenizer and seq2seq model from the Hub
model_name = 'zhaochaofeng/chat-t5'
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Encode the prompt ("Please explain what machine learning is")
txt = '请介绍一下什么是机器学习'
inputs = tokenizer(text=txt, return_tensors='pt')

# Sample up to 300 new tokens, then decode back to text
outputs = model.generate(inputs.input_ids, max_new_tokens=300, do_sample=True)
res = tokenizer.batch_decode(outputs, skip_special_tokens=True)
print(res)
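For quick experiments, the same load-encode-generate-decode steps above can also be driven through the Transformers `pipeline` helper, which bundles tokenization, generation, and decoding into one call. This is a sketch, not part of the original card; it assumes the model works with the `text2text-generation` task, which is the standard pipeline for T5-style seq2seq models.

from transformers import pipeline

# Sketch: text2text-generation wraps tokenizer + model.generate + decode
chat = pipeline('text2text-generation', model='zhaochaofeng/chat-t5')

# Same prompt as above: "Please explain what machine learning is"
result = chat('请介绍一下什么是机器学习', max_new_tokens=300, do_sample=True)

# The pipeline returns a list of dicts with a 'generated_text' key
print(result[0]['generated_text'])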