Use with the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="igor273/phi-4-genaiscript")
messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe(messages)
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("igor273/phi-4-genaiscript")
model = AutoModelForCausalLM.from_pretrained("igor273/phi-4-genaiscript")
messages = [
    {"role": "user", "content": "Who are you?"},
]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:]))
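For intuition, `apply_chat_template` renders the message list into a single prompt string before tokenization. The sketch below shows the ChatML-style layout phi-4 uses (the `<|im_sep|>` token is specific to phi-4's template; this is illustrative only, and the tokenizer's own template is the source of truth):

```python
# Sketch of the ChatML-style prompt phi-4's chat template produces.
# The exact special tokens come from the tokenizer; this is illustrative only.
def build_prompt(messages, add_generation_prompt=True):
    parts = [
        f"<|im_start|>{m['role']}<|im_sep|>{m['content']}<|im_end|>"
        for m in messages
    ]
    if add_generation_prompt:
        # Open an assistant turn so generation continues from there.
        parts.append("<|im_start|>assistant<|im_sep|>")
    return "".join(parts)

prompt = build_prompt([{"role": "user", "content": "Who are you?"}])
print(prompt)
```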
Fine-tuned phi-4 on GenAIScript

This model is a fine-tuned version of microsoft/phi-4 on the GenAIScript training dataset. The base phi-4 model has no prior knowledge of the GenAIScript scripting language, as it was not part of its pretraining data. This fine-tuned version has been specifically trained to understand and generate valid GenAIScript code.

Model Description

  • Base model: microsoft/phi-4
  • Fine-tuned on: igor273/genaiscript_training_dataset
  • Task: Code generation and completion for GenAIScript
  • Quantized: Yes — optimized for local inference on resource-constrained machines
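A back-of-envelope estimate shows why quantization matters for local inference: weight memory scales with bytes per parameter (weights only; the KV cache and activations add more on top):

```python
# Rough weight-memory estimate for a 15B-parameter model (weights only).
params = 15e9
bf16_gb = params * 2 / 1e9    # BF16: 2 bytes per weight
int4_gb = params * 0.5 / 1e9  # 4-bit quantized: 0.5 bytes per weight
print(f"BF16: ~{bf16_gb:.0f} GB, 4-bit: ~{int4_gb:.1f} GB")
```

At full BF16 precision the weights alone need roughly 30 GB; 4-bit quantization brings that to about 7.5 GB, within reach of consumer GPUs.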

Dataset

The dataset was created from official Microsoft GenAIScript documentation and real-world code snippets. It includes:

  • Script generation examples
  • Function usage and syntax patterns
  • Control structures and logic flows
  • Valid use cases and best practices
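The published record schema is not reproduced here, but a chat-style fine-tuning record for such a dataset might look like the following JSONL line (the field names and the exact layout of `igor273/genaiscript_training_dataset` are an assumption):

```python
import json

# Hypothetical chat-style training record; the actual schema of the
# dataset may differ.
record = {
    "messages": [
        {"role": "user",
         "content": "Write a GenAIScript that summarizes the files in context."},
        {"role": "assistant",
         "content": 'script({ title: "summarize" })\n'
                    'def("FILE", env.files)\n'
                    '$`Summarize FILE in one paragraph.`'},
    ]
}
line = json.dumps(record)   # one JSONL line
parsed = json.loads(line)
print(parsed["messages"][0]["role"])
```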

Capabilities

  • Understands GenAIScript syntax and semantics
  • Can generate end-to-end scripts from natural language prompts
  • Can assist in learning and exploring GenAIScript capabilities

Limitations

  • May require updates if the GenAIScript specification evolves
  • Quantization may reduce generation precision in some edge cases

License

The base model phi-4 and the dataset are subject to their respective licenses. This fine-tuned version inherits those terms.


Maintained by @igor273

Model size: 15B parameters (BF16, Safetensors)