```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("quintic/codegen-diff-6B-gptj")
model = AutoModelForCausalLM.from_pretrained("quintic/codegen-diff-6B-gptj")
```
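Once loaded, the model can be prompted like the upstream diff-codegen models. The sketch below assumes the `<NME>` / `<BEF>` / `<MSG>` / `<DFF>` prompt format described in CarperAI's model card for the original checkpoint; see that card (linked below) for the authoritative description.

```python
# Sketch: prompt format assumed from CarperAI's diff-codegen model card
# (filename, file contents, commit message, then the <DFF> marker).
prompt = (
    "<NME> utils.py\n"
    "<BEF> def add(a, b):\n"
    "    return a - b\n"
    "<MSG> Fix subtraction bug in add()\n"
    "<DFF>"
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```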
This is a conversion of https://huggingface.co/CarperAI/diff-codegen-350m-v2 to the GPT-J implementation, produced with the script at https://gist.github.com/moyix/7896575befbe1b99162ccfec8d135566.

For details, please refer to CarperAI's model card. Anyone can reproduce this conversion easily; the checkpoint is uploaded here simply for convenient access, without having to move files back and forth.
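Because the checkpoint is stored in the GPT-J format, it can also be loaded through the dedicated GPT-J class. This is a minimal sanity-check sketch, not part of the conversion script itself.

```python
# Sketch: confirm the converted checkpoint loads via the GPT-J model class
# rather than the original CodeGen class.
from transformers import GPTJForCausalLM

model = GPTJForCausalLM.from_pretrained("quintic/codegen-diff-6B-gptj")
print(model.config.model_type)  # expected "gptj" if the conversion produced a GPT-J config
```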
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="quintic/codegen-diff-6B-gptj")
```
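A brief usage sketch for the pipeline; the prompt text is illustrative and follows the same assumed diff-style format as above.

```python
# Illustrative call: the pipeline returns a list of generated sequences.
result = pipe(
    "<NME> README.md\n<BEF> # My project\n<MSG> Add install instructions\n<DFF>",
    max_new_tokens=64,
)
print(result[0]["generated_text"])
```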