How to use from the Transformers library
# Load model directly
from transformers import AutoModelForSeq2SeqLM
model = AutoModelForSeq2SeqLM.from_pretrained("subbu264/codeT5p_2b_finetune", trust_remote_code=True, dtype="auto")
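The snippet above only loads the checkpoint. A fuller sketch of actually generating text with it is below; this follows the pattern from the base CodeT5+ 2B model card (which passes the encoder input ids as `decoder_input_ids` as well), but the prompt, `max_new_tokens` value, and device handling here are illustrative assumptions, not part of this repository's documentation.

```python
# Hedged sketch: running generation with the fine-tuned checkpoint.
# Assumes `transformers` and `torch` are installed and the Hub is reachable.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "subbu264/codeT5p_2b_finetune"
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(checkpoint, trust_remote_code=True)
model = AutoModelForSeq2SeqLM.from_pretrained(
    checkpoint, trust_remote_code=True, dtype="auto"
).to(device)

# Example prompt (hypothetical); CodeT5+ 2B-style checkpoints expect the
# input ids to be fed to the decoder as well.
encoding = tokenizer("def print_hello_world():", return_tensors="pt").to(device)
encoding["decoder_input_ids"] = encoding["input_ids"].clone()

outputs = model.generate(**encoding, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

`trust_remote_code=True` is required because the CodeT5+ 2B architecture ships custom modeling code with the checkpoint rather than living in the `transformers` library itself.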
Downloads last month: 5
Safetensors — model size: 3B params; tensor types: F32, U8