How to use with the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="quintic/codegen-diff-350M-gptj")
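
A minimal sketch of calling the pipeline. The prompt and generation settings below are illustrative only; the diff prompt format the model was trained on is described in Carper's model card.

# Illustrative call; adjust the prompt and decoding settings to your needs
output = pipe("def fibonacci(n):", max_new_tokens=64)
print(output[0]["generated_text"])
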
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("quintic/codegen-diff-350M-gptj")
model = AutoModelForCausalLM.from_pretrained("quintic/codegen-diff-350M-gptj")
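
A minimal generation sketch with the directly loaded model. The prompt and decoding parameters are assumptions for illustration; for the diff prompt format expected by the model, refer to Carper's model card.

# Illustrative prompt; diff models expect the prompt format described in Carper's model card
prompt = "def hello_world():"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))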

This is a conversion of https://huggingface.co/CarperAI/diff-codegen-350m-v2 to the GPT-J implementation, produced with the script https://gist.github.com/moyix/7896575befbe1b99162ccfec8d135566

For details, please refer to Carper's model card. Anyone can reproduce this conversion easily; the model is uploaded here simply for convenient access, without having to move files back and forth.
