# CodeParrot

CodeParrot (large) is a 1.5B-parameter GPT-2 model trained on the CodeParrot Python code dataset. The model was trained in Chapter 10, *Training Transformers from Scratch*, of the book *NLP with Transformers*. You can find the full code in the accompanying GitHub repository.

Load the model and tokenizer directly:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("transformersbook/codeparrot")
model = AutoModelForCausalLM.from_pretrained("transformersbook/codeparrot")
```
Or use a pipeline as a high-level helper:

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="transformersbook/codeparrot")
```
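Once the pipeline is loaded, the model can be asked to complete Python code. A minimal sketch (the prompt and the generation parameters `max_new_tokens` and `temperature` are illustrative choices, not from the model card; note that this downloads the full 1.5B-parameter checkpoint, several GB):

```python
from transformers import pipeline

# Downloads the full CodeParrot (large) checkpoint on first use.
pipe = pipeline("text-generation", model="transformersbook/codeparrot")

# Illustrative prompt: ask the model to complete a function body.
prompt = "def fibonacci(n):"
# Sampling settings here are illustrative; adjust to taste.
outputs = pipe(prompt, max_new_tokens=50, do_sample=True, temperature=0.2)

# The pipeline returns the prompt followed by the generated continuation.
print(outputs[0]["generated_text"])
```

Since the model was trained purely on Python source files, plain-code prompts like the one above tend to work better than natural-language instructions.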