Instructions for using MickyMike/CT5 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use MickyMike/CT5 with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("MickyMike/CT5")
model = AutoModelForSeq2SeqLM.from_pretrained("MickyMike/CT5")
```
- Notebooks
- Google Colab
- Kaggle
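Once the tokenizer and model are loaded, inference follows the standard Transformers seq2seq pattern: tokenize an input string, call `generate`, and decode the result. The snippet below is a minimal sketch; the example input is arbitrary, and the exact prompt format CT5 expects is an assumption — check the model card for the task it was trained on.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("MickyMike/CT5")
model = AutoModelForSeq2SeqLM.from_pretrained("MickyMike/CT5")

# The input below is an arbitrary illustration, not a documented prompt
# format for CT5 -- consult the model card for the intended task input.
inputs = tokenizer("def add(a, b): return a + b", return_tensors="pt")

# Greedy decoding with a small generation budget; tune as needed.
output_ids = model.generate(**inputs, max_new_tokens=64)
out_text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(out_text)
```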