How to use Salesforce/codet5p-2b with Transformers:
```python
# Load model directly
from transformers import AutoModelForSeq2SeqLM

model = AutoModelForSeq2SeqLM.from_pretrained(
    "Salesforce/codet5p-2b",
    trust_remote_code=True,
    dtype="auto",
)
```
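For comparison with the hosted endpoint, here is a minimal sketch of running the checkpoint locally, following the load-and-generate pattern shown on the CodeT5+ model card. The helper name `generate_completion` and the sample prompt are illustrative, not from the original post; the model is large (~2B parameters), so the actual generation is guarded behind `__main__`.

```python
CHECKPOINT = "Salesforce/codet5p-2b"

def generate_completion(prompt: str, max_length: int = 32) -> str:
    # Imports kept inside the helper so the sketch is cheap to define;
    # trust_remote_code=True is required because the checkpoint ships
    # custom model code.
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT, trust_remote_code=True)
    model = AutoModelForSeq2SeqLM.from_pretrained(
        CHECKPOINT,
        trust_remote_code=True,
        dtype="auto",
    )
    encoding = tokenizer(prompt, return_tensors="pt")
    # CodeT5+ 2B is an encoder-decoder; the model card seeds the decoder
    # with the encoder input before calling generate().
    encoding["decoder_input_ids"] = encoding["input_ids"].clone()
    outputs = model.generate(**encoding, max_length=max_length)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Downloads the full checkpoint on first run; needs substantial RAM/VRAM.
    print(generate_completion("def print_hello_world():"))
```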
When I try running it through the model card's inference widget (see screenshot) or through my own HF Inference Endpoint deployment, I get an "EOF when reading a line" error message.