BitMamba-2-0.25B / config.json
{
  "architectures": ["BitMamba2LM"],
  "model_type": "bitmamba",
  "d_model": 1024,
  "n_layers": 24,
  "n_heads": 16,
  "vocab_size": 50257,
  "ssm_d_state": 128,
  "ssm_d_conv": 4,
  "expand": 2,
  "rms_norm_eps": 1e-6,
  "quantization": {
    "bits": 1.58,
    "group_size": null,
    "zero_point": false
  },
  "bos_token_id": 50256,
  "eos_token_id": 50256,
  "transformers_version": "5.0.0"
}
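A minimal sketch of how this config might be consumed. The field names come directly from the JSON above; the derived quantities (head dimension, expanded SSM width) and the reading of `"bits": 1.58` as ternary weights in the BitNet b1.58 style (log2(3) ≈ 1.58 bits per weight, values in {-1, 0, +1}) are assumptions, not something stated in the file.

```python
import json

# Config copied verbatim from the file above.
config_text = """
{
  "architectures": ["BitMamba2LM"],
  "model_type": "bitmamba",
  "d_model": 1024,
  "n_layers": 24,
  "n_heads": 16,
  "vocab_size": 50257,
  "ssm_d_state": 128,
  "ssm_d_conv": 4,
  "expand": 2,
  "rms_norm_eps": 1e-6,
  "quantization": {
    "bits": 1.58,
    "group_size": null,
    "zero_point": false
  },
  "bos_token_id": 50256,
  "eos_token_id": 50256,
  "transformers_version": "5.0.0"
}
"""
cfg = json.loads(config_text)

# Derived shapes (assumed conventions, typical for Mamba-2-style models):
head_dim = cfg["d_model"] // cfg["n_heads"]   # per-head width
d_inner = cfg["expand"] * cfg["d_model"]      # expanded inner SSM width

print(head_dim, d_inner, cfg["quantization"]["bits"])
```

With these values, `head_dim` is 64 and `d_inner` is 2048; `group_size: null` and `zero_point: false` suggest symmetric, ungrouped (per-tensor or per-channel) quantization, though the file does not spell out the granularity.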