Text Generation · Transformers · PyTorch · code · gpt2 · generation · text-generation-inference
Use from the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="codeparrot/codeparrot-small-code-to-text")
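A minimal usage sketch for the pipeline above. The prompt is an illustrative assumption: since the model was fine-tuned on Python code followed by a docstring explanation, we pass a short function and let the model continue it.

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="codeparrot/codeparrot-small-code-to-text")

# Example prompt (hypothetical): a small function the model should explain.
prompt = "def add(a, b):\n    return a + b\n"
out = pipe(prompt, max_new_tokens=32, do_sample=False)
print(out[0]["generated_text"])
```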
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("codeparrot/codeparrot-small-code-to-text")
model = AutoModelForCausalLM.from_pretrained("codeparrot/codeparrot-small-code-to-text")
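When loading the model directly, generation goes through `model.generate`. A minimal sketch, assuming the same code-then-docstring prompt format; the sampling settings here are illustrative, not the model card's recommendation.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("codeparrot/codeparrot-small-code-to-text")
model = AutoModelForCausalLM.from_pretrained("codeparrot/codeparrot-small-code-to-text")

# Example prompt (hypothetical): code the model should continue with an explanation.
code = "def add(a, b):\n    return a + b\n"
inputs = tokenizer(code, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=32,
    do_sample=False,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token
)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```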
CodeParrot 🦜 small for code-to-text generation

This model is CodeParrot-small (from the megatron branch) fine-tuned on github-jupyter-code-to-text, a dataset in which each sample is a succession of Python code and its explanation as a docstring, originally extracted from parsed Jupyter notebooks.

