llama2-7b-base-mnn / llm_config.json
Uploaded by Yanbo2 via huggingface_hub (commit 5d42957, verified)
{
  "hidden_size": 4096,
  "layer_nums": 32,
  "attention_mask": "float",
  "key_value_shape": [2, 1, 0, 32, 128],
  "bos": "",
  "system_prompt_template": "%s",
  "user_prompt_template": "%s",
  "assistant_prompt_template": "%s",
  "is_visual": false
}
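A minimal Python sketch (not part of this repo) showing how the fields above fit together for a llama2-7b checkpoint. The interpretation of `key_value_shape` as `[2 (key+value), batch, seq_len, num_heads, head_dim]`, with `0` marking the dynamically sized sequence dimension, follows the layout commonly used by MNN's LLM runtime and is an assumption here, not something stated in the file itself:

```python
import json

# llm_config.json reproduced inline so the sketch is self-contained.
raw = """
{
  "hidden_size": 4096,
  "layer_nums": 32,
  "attention_mask": "float",
  "key_value_shape": [2, 1, 0, 32, 128],
  "bos": "",
  "system_prompt_template": "%s",
  "user_prompt_template": "%s",
  "assistant_prompt_template": "%s",
  "is_visual": false
}
"""
config = json.loads(raw)

# Assumed layout (MNN KV-cache convention, not stated in the file):
# [2 (key + value), batch, seq_len (0 = filled at runtime), num_heads, head_dim]
_, batch, seq_len, num_heads, head_dim = config["key_value_shape"]

# Sanity check: attention heads times head dimension equals the hidden size,
# as expected for llama2-7b (32 heads * 128 dims = 4096).
assert num_heads * head_dim == config["hidden_size"]
```

Since this is a base (non-chat) model, the `%s` prompt templates pass user text through unchanged and `bos` is empty; a chat fine-tune would carry role markers in these fields instead.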