Wrong config.json

#3
by Nabbers1999 - opened

The config.json uploaded with your 27B is actually the one from the 12B, so it has the wrong shape.

Your 12B and 27B (both currently show the same values):
```json
"text_config": {
  "hidden_size": 3840,
  "intermediate_size": 15360,
  "model_type": "gemma3_text",
  "num_attention_heads": 16,
  "num_hidden_layers": 48,
  "num_key_value_heads": 8,
  "rope_scaling": {
    "factor": 8.0,
    "rope_type": "linear"
  },
```

The real 27B:
```json
"text_config": {
  "head_dim": 128,
  "hidden_size": 5376,
  "intermediate_size": 21504,
  "model_type": "gemma3_text",
  "num_attention_heads": 32,
  "num_hidden_layers": 62,
  "num_key_value_heads": 16,
  "query_pre_attn_scalar": 168,
  "rope_scaling": {
    "factor": 8.0,
    "rope_type": "linear"
  },
```
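For anyone who wants to verify a download before this is fixed, here's a quick sketch that flags mismatched fields. The expected values are taken from the real 27B config above; the helper name and inline JSON are just illustrative:

```python
import json

# Expected text_config shapes for the real 27B (values from this thread).
EXPECTED_27B = {
    "hidden_size": 5376,
    "intermediate_size": 21504,
    "num_attention_heads": 32,
    "num_hidden_layers": 62,
    "num_key_value_heads": 16,
}

def find_mismatches(text_config, expected=EXPECTED_27B):
    """Return {field: (found, expected)} for every field that differs."""
    return {
        key: (text_config.get(key), want)
        for key, want in expected.items()
        if text_config.get(key) != want
    }

# The text_config currently shipped with the 27B (actually the 12B shape).
uploaded = json.loads("""{
    "hidden_size": 3840,
    "intermediate_size": 15360,
    "model_type": "gemma3_text",
    "num_attention_heads": 16,
    "num_hidden_layers": 48,
    "num_key_value_heads": 8
}""")

print(find_mismatches(uploaded))
```

With the currently uploaded config, every one of the five checked fields comes back mismatched.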
