Phi-4 30B — an "added layers" version of Phi-4 15B, with no knowledge loss

Info

I added 40 more layers to Phi-4 15B, with the new layers' parameters zero-initialized. Because each sub-layer's contribution is added onto the residual stream, a zero-initialized block initially computes the identity, so the upscaled model behaves exactly like the original — no knowledge is lost.
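The identity property above can be verified on a toy block. The sketch below is illustrative only — it uses a generic pre-norm transformer block, not phi-4's exact architecture, and zeroes just the output projections (one of several zero-init schemes that make a residual block a no-op):

```python
import torch
import torch.nn as nn

# Minimal pre-norm transformer block (hypothetical; not phi-4's exact layers)
class Block(nn.Module):
    def __init__(self, d=64, heads=4):
        super().__init__()
        self.ln1 = nn.LayerNorm(d)
        self.attn = nn.MultiheadAttention(d, heads, batch_first=True)
        self.ln2 = nn.LayerNorm(d)
        self.mlp = nn.Sequential(nn.Linear(d, 4 * d), nn.GELU(), nn.Linear(4 * d, d))

    def forward(self, x):
        h = self.ln1(x)
        a, _ = self.attn(h, h, h, need_weights=False)
        x = x + a                      # residual add: attention branch
        x = x + self.mlp(self.ln2(x))  # residual add: MLP branch
        return x

def zero_init(block: Block) -> None:
    # Zero only the output projections: each branch then emits zeros,
    # and the residual connections make the whole block an identity map.
    nn.init.zeros_(block.attn.out_proj.weight)
    nn.init.zeros_(block.attn.out_proj.bias)
    nn.init.zeros_(block.mlp[2].weight)
    nn.init.zeros_(block.mlp[2].bias)

torch.manual_seed(0)
b = Block()
zero_init(b)
x = torch.randn(2, 5, 64)
with torch.no_grad():
    y = b(x)
print(torch.allclose(x, y))  # the zero-initialized block passes input through unchanged
```

Inserting such blocks anywhere in the decoder stack leaves the network's function unchanged at initialization, while giving fine-tuning extra capacity to train into.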

Hardware

An RTX 6000 Blackwell WS with 96 GB of VRAM; the whole task completed within 20 minutes.

Code

The code can be found in my repository.


GitHub repo: ag-aryav/Phi-4-30B

Model size: 28B params (BF16, Safetensors)

Base model: microsoft/phi-4