---
license: apache-2.0
language: en
tags:
- code-generation
model_type: gpt2
---

# MBPP SFT CodeParrot

Fine-tuned variant of `codeparrot-small`, trained on 500 samples from the MBPP (Mostly Basic Python Problems) dataset.

## Usage

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

m = "vanishingradient/mbpp-sft-codeparrot"
tok = AutoTokenizer.from_pretrained(m)
mod = AutoModelForCausalLM.from_pretrained(m)

x = tok("Write a python function to reverse a string.", return_tensors="pt")
y = mod.generate(**x, max_new_tokens=120)
print(tok.decode(y[0], skip_special_tokens=True))
```