How to use from llama.cpp
Install via Homebrew (macOS/Linux)
brew install llama.cpp
# Start a local OpenAI-compatible server with a web UI:
llama-server -hf igor273/phi-4-genaiscript:Q4_K_M
# Run inference directly in the terminal:
llama-cli -hf igor273/phi-4-genaiscript:Q4_K_M
Install from WinGet (Windows)
winget install llama.cpp
# Start a local OpenAI-compatible server with a web UI:
llama-server -hf igor273/phi-4-genaiscript:Q4_K_M
# Run inference directly in the terminal:
llama-cli -hf igor273/phi-4-genaiscript:Q4_K_M
Use pre-built binary
# Download pre-built binary from:
# https://github.com/ggerganov/llama.cpp/releases
# Start a local OpenAI-compatible server with a web UI:
./llama-server -hf igor273/phi-4-genaiscript:Q4_K_M
# Run inference directly in the terminal:
./llama-cli -hf igor273/phi-4-genaiscript:Q4_K_M
Build from source code
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
cmake -B build
cmake --build build -j --target llama-server llama-cli
# Start a local OpenAI-compatible server with a web UI:
./build/bin/llama-server -hf igor273/phi-4-genaiscript:Q4_K_M
# Run inference directly in the terminal:
./build/bin/llama-cli -hf igor273/phi-4-genaiscript:Q4_K_M
Use Docker
# Requires Docker Model Runner (available in recent Docker Desktop releases):
docker model run hf.co/igor273/phi-4-genaiscript:Q4_K_M
Fine-tuned phi-4 on GenAIScript

This model is a fine-tuned version of microsoft/phi-4 on the GenAIScript training dataset. The base phi-4 model has no prior knowledge of the GenAIScript scripting language, as it was not part of its pretraining data. This fine-tuned version has been specifically trained to understand and generate valid GenAIScript code.

Model Description

  • Base model: microsoft/phi-4
  • Fine-tuned on: igor273/genaiscript_training_dataset
  • Task: Code generation and completion for GenAIScript
  • Quantized: Yes — optimized for local inference on resource-constrained machines

Dataset

The dataset was created from official Microsoft GenAIScript documentation and real-world code snippets. It includes:

  • Script generation examples
  • Function usage and syntax patterns
  • Control structures and logic flows
  • Valid use cases and best practices

Capabilities

  • Understands GenAIScript syntax and semantics
  • Can generate end-to-end scripts from natural language prompts
  • Can assist in learning and exploring GenAIScript capabilities

Limitations

  • May require updates if the GenAIScript specification evolves
  • Quantization may reduce generation precision in some edge cases

License

The base model phi-4 and the dataset are subject to their respective licenses. This fine-tuned version inherits those terms.


Maintained by @igor273

Model size: 15B params · Tensor type: BF16 (Safetensors)
