OpenRP3B-Llama3.2 / config.json
{
  "model_type": "llama",
  "hidden_size": 3200,
  "num_attention_heads": 32,
  "num_hidden_layers": 80
}
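A minimal sketch of how these fields relate: in a Llama-style transformer, the per-head dimension is `hidden_size` divided by `num_attention_heads`. The snippet below parses the config fragment above with the standard library and derives that value; embedding the JSON as a string literal is purely for illustration.

```python
import json

# The config fragment above, embedded as a string for illustration.
config_text = """
{
  "model_type": "llama",
  "hidden_size": 3200,
  "num_attention_heads": 32,
  "num_hidden_layers": 80
}
"""

config = json.loads(config_text)

# Each attention head operates on hidden_size / num_attention_heads dimensions.
head_dim = config["hidden_size"] // config["num_attention_heads"]
print(head_dim)  # 3200 / 32 = 100
```

Loaders such as Hugging Face `transformers` read the same keys when instantiating the model, so a mismatch here (for example, a `hidden_size` not divisible by `num_attention_heads`) would cause construction to fail.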