phi2-gpro/sft-model/config.json
{
  "architectures": [
    "PhiForCausalLM"
  ],
  "model_type": "phi",
  "torch_dtype": "float16",
  "transformers_version": "4.36.2"
}
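In practice this file is loaded for you by `transformers` (e.g. via `AutoConfig.from_pretrained`), but as a minimal sketch, the fields above can be read with the standard library alone. The JSON literal below simply mirrors the file contents; nothing beyond the keys shown in the config is assumed.

```python
import json

# The config.json contents, reproduced verbatim from the file above.
config_text = """
{
  "architectures": ["PhiForCausalLM"],
  "model_type": "phi",
  "torch_dtype": "float16",
  "transformers_version": "4.36.2"
}
"""

config = json.loads(config_text)

# "architectures" is a list: a checkpoint may declare several model classes.
print(config["architectures"][0])  # PhiForCausalLM
print(config["model_type"])        # phi
print(config["torch_dtype"])       # float16
```

Note that `torch_dtype` here is stored as the string `"float16"`; `transformers` maps it to the corresponding `torch.dtype` when instantiating the model.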