Tags: Transformers · PyTorch · Safetensors · Chinese · t5 · text2text-generation · sentencepiece · text-generation-inference
Instructions to use IDEA-CCNL/Randeng-T5-77M-MultiTask-Chinese with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use IDEA-CCNL/Randeng-T5-77M-MultiTask-Chinese with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("IDEA-CCNL/Randeng-T5-77M-MultiTask-Chinese")
model = AutoModelForSeq2SeqLM.from_pretrained("IDEA-CCNL/Randeng-T5-77M-MultiTask-Chinese")
```

- Notebooks
  - Google Colab
  - Kaggle
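Once the model and tokenizer are loaded, text can be generated with `model.generate`. The sketch below is a minimal example; the task-prefix prompt format shown is an assumption — consult the model card for the exact prompts this multitask model expects.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("IDEA-CCNL/Randeng-T5-77M-MultiTask-Chinese")
model = AutoModelForSeq2SeqLM.from_pretrained("IDEA-CCNL/Randeng-T5-77M-MultiTask-Chinese")

# Hypothetical task-prefix prompt (sentiment analysis); check the model
# card for the prompt formats the model was actually trained on.
prompt = "情感分析任务：【这家餐厅的菜非常好吃】这篇文章的情感态度是什么？"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)
```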
Adding `safetensors` variant of this model
#1
by SFconvertbot - opened
This is an automated PR created with https://huggingface.co/spaces/safetensors/convert
This new file is equivalent to `pytorch_model.bin`, but safe in the sense that
no arbitrary code can be put into it.
These files also happen to load much faster than their PyTorch counterparts:
https://colab.research.google.com/github/huggingface/notebooks/blob/main/safetensors_doc/en/speed.ipynb
The widgets on your model page will run using this file even before the PR is merged,
confirming that the file actually works.
Feel free to ignore this PR.
roygan changed pull request status to merged