```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("codeparrot/codeparrot-small-code-to-text")
model = AutoModelForCausalLM.from_pretrained("codeparrot/codeparrot-small-code-to-text")
```
# CodeParrot 🦜 small for code-to-text generation
This model is CodeParrot-small (from the megatron branch) fine-tuned on github-jupyter-code-to-text, a dataset whose samples are a succession of Python code and its explanation as a docstring, originally extracted from parsed Jupyter notebooks.
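Because the training samples interleave code with a docstring-style explanation, one plausible way to prompt the model is to append an opening docstring after the code and let the model continue. A minimal sketch; the `build_prompt` helper and its `"""` / `Explanation:` suffix are assumptions based on the dataset layout, not a documented prompt format:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

checkpoint = "codeparrot/codeparrot-small-code-to-text"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

def build_prompt(code: str) -> str:
    # Hypothetical prompt: append an opening triple quote so the model
    # continues with an explanation, mirroring the dataset's
    # code-followed-by-docstring structure.
    return code + '\n"""\nExplanation:'

code = "def square(x):\n    return x ** 2"
inputs = tokenizer(build_prompt(code), return_tensors="pt")
# Greedy decoding with a small budget; both are illustrative settings.
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0]))
```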
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="codeparrot/codeparrot-small-code-to-text")
```
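The pipeline can then be called directly on a code snippet; it handles tokenization, generation, and decoding in one step. The generation settings below (`max_new_tokens`, greedy decoding) are illustrative choices, not recommendations from the model card:

```python
from transformers import pipeline

# Same checkpoint as above; text-generation returns the prompt plus continuation.
pipe = pipeline("text-generation", model="codeparrot/codeparrot-small-code-to-text")

code = "def add(a, b):\n    return a + b"
# max_new_tokens and do_sample=False are illustrative settings.
result = pipe(code, max_new_tokens=30, do_sample=False)
print(result[0]["generated_text"])
```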