```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("claudios/CodeGPT-Multilingual")
model = AutoModelForCausalLM.from_pretrained("claudios/CodeGPT-Multilingual")
```
This is an unofficial re-upload of AISE-TUDelft/CodeGPT-Multilingual in the SafeTensors format, converted with transformers 4.40.1. The goal of this re-upload is to keep older models that remain relevant baselines from going stale as HuggingFace evolves. It may also include minor corrections, such as the model's max-length configuration.
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="claudios/CodeGPT-Multilingual")
```
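As a sketch of end-to-end usage: the snippet below calls the pipeline on a code prompt and prints the completion. The prompt and generation parameters are illustrative assumptions, not part of the original card.

```python
from transformers import pipeline

# Load the text-generation pipeline for this model
pipe = pipeline("text-generation", model="claudios/CodeGPT-Multilingual")

# Illustrative prompt (an assumption for demo purposes): a Python function stub
prompt = "def add(a, b):"

# Generate a short, deterministic continuation; max_new_tokens is an arbitrary choice
outputs = pipe(prompt, max_new_tokens=32, do_sample=False)

# Each result is a dict whose "generated_text" holds prompt + continuation
print(outputs[0]["generated_text"])
```

By default the pipeline returns the full text (prompt plus continuation); pass `return_full_text=False` if you only want the newly generated tokens.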