# emirozturk/phi-3-mini-4k-t2c
This repository contains a model fine-tuned for Turkish-language code generation.

- **Checkpoint:** step 10000
- **Dataset:** a Turkish translation of `python_code_instructions_18k_alpaca`
## Example Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned model and its tokenizer from the Hub
model = AutoModelForCausalLM.from_pretrained("emirozturk/phi-3-mini-4k-t2c")
tokenizer = AutoTokenizer.from_pretrained("emirozturk/phi-3-mini-4k-t2c")

# Turkish prompt: "Write a program in Python that computes the
# x-intercepts of the equation ax^2 + bx + c = 0."
prompt = "Python'da ax^2 + bx + c = 0 denkleminin x-kesişimlerini hesaplayan bir program yaz."

inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
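Since the model was fine-tuned on a translated version of `python_code_instructions_18k_alpaca`, wrapping the instruction in an Alpaca-style layout may match the training data more closely than a bare prompt. The exact template (and whether its preamble was translated) is not stated in this card, so the helper below is a sketch under that assumption:

```python
# Hypothetical helper: formats an instruction in the Alpaca layout
# (### Instruction / ### Input / ### Response). The English preamble is an
# assumption; the checkpoint may expect a Turkish-translated preamble instead.

def build_alpaca_prompt(instruction: str, inp: str = "") -> str:
    """Return an Alpaca-style prompt for an instruction and optional input."""
    if inp:
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{inp}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

prompt = build_alpaca_prompt(
    "Python'da ax^2 + bx + c = 0 denkleminin x-kesişimlerini "
    "hesaplayan bir program yaz."
)
```

The resulting `prompt` string can be passed to the tokenizer exactly as in the example above.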