mlx_slm / config.json
Upload config.json with huggingface_hub (commit d780652)
{
"vocab_size": 50096,
"n_layer": 18,
"hidden_size": 768,
"intermediate_size": 3072,
"n_head": 12,
"context_size": 512,
"rms_norm_eps": 1e-6,
"dropout": 0.1,
"bos_token_id": 1,
"eos_token_id": 2
}
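The fields above can be loaded into a small typed config object. The sketch below is illustrative, not part of the repo: the `SLMConfig` class name is invented here, and the parameter count assumes a plain pre-norm transformer with untied input embeddings and LM head, Q/K/V/output attention projections, and a two-matrix feed-forward block (biases and norm weights ignored).

```python
import json
from dataclasses import dataclass

# The config.json contents from the repo, inlined for a self-contained example.
CONFIG = """
{
  "vocab_size": 50096,
  "n_layer": 18,
  "hidden_size": 768,
  "intermediate_size": 3072,
  "n_head": 12,
  "context_size": 512,
  "rms_norm_eps": 1e-6,
  "dropout": 0.1,
  "bos_token_id": 1,
  "eos_token_id": 2
}
"""

@dataclass
class SLMConfig:
    vocab_size: int
    n_layer: int
    hidden_size: int
    intermediate_size: int
    n_head: int
    context_size: int
    rms_norm_eps: float
    dropout: float
    bos_token_id: int
    eos_token_id: int

    @property
    def head_dim(self) -> int:
        # Each attention head gets an equal slice of the hidden dimension.
        assert self.hidden_size % self.n_head == 0
        return self.hidden_size // self.n_head

    def approx_params(self) -> int:
        # Rough weight count under the assumptions stated above:
        # untied token embeddings + LM head, four hidden x hidden attention
        # projections, and an up/down feed-forward pair per layer.
        embed = 2 * self.vocab_size * self.hidden_size
        attn = 4 * self.hidden_size * self.hidden_size
        mlp = 2 * self.hidden_size * self.intermediate_size
        return embed + self.n_layer * (attn + mlp)

cfg = SLMConfig(**json.loads(CONFIG))
print(cfg.head_dim)                               # 768 / 12 = 64
print(f"{cfg.approx_params() / 1e6:.0f}M params")
```

Under these assumptions the weights work out to roughly 204M parameters, with the 18 transformer layers (about 127M) outweighing the embeddings and LM head (about 77M); a model that ties its embedding and output matrices would be correspondingly smaller.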