malm-165m / config.json
Uploaded by codelion using huggingface_hub (commit 54aa220, verified)
{
  "vocab_size": 14407,
  "d_model": 768,
  "n_heads": 12,
  "n_layers": 12,
  "n_query_layers": 4,
  "max_seq_len": 128,
  "num_parameters": 165123656,
  "num_functions": 2000
}
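A minimal sketch of how this config could be parsed into a typed object on the Python side. The `MALMConfig` dataclass and `head_dim` property are illustrative assumptions, not the repo's actual loading code; only the field names and values come from the JSON above.

```python
import json
from dataclasses import dataclass


# Hypothetical container for the config fields; the malm-165m repo's
# real loader may use a different class or transformers' PretrainedConfig.
@dataclass
class MALMConfig:
    vocab_size: int
    d_model: int
    n_heads: int
    n_layers: int
    n_query_layers: int
    max_seq_len: int
    num_parameters: int
    num_functions: int

    @property
    def head_dim(self) -> int:
        # d_model must divide evenly across the attention heads
        assert self.d_model % self.n_heads == 0
        return self.d_model // self.n_heads


raw = """{
  "vocab_size": 14407,
  "d_model": 768,
  "n_heads": 12,
  "n_layers": 12,
  "n_query_layers": 4,
  "max_seq_len": 128,
  "num_parameters": 165123656,
  "num_functions": 2000
}"""

config = MALMConfig(**json.loads(raw))
print(config.head_dim)  # 768 / 12 = 64
```

With `d_model = 768` and `n_heads = 12`, each attention head operates on a 64-dimensional slice, matching common 768-wide transformer configurations.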