BABYAI / Configuration.json
{
"architectures": ["MistralForCausalLM"],
"model_type": "mistral",
"vocab_size": 32000,
"hidden_size": 4096,
"num_hidden_layers": 32,
"num_attention_heads": 32,
"intermediate_size": 14336,
"max_position_embeddings": 32768,
"initializer_range": 0.02,
"rope_theta": 1000000.0
}
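
For reference, a minimal sketch of how this configuration could be loaded with the Hugging Face transformers library. The snippet and the local file path are illustrative assumptions, not part of the repository; note that transformers' from_pretrained machinery looks for a file named config.json by default, so a file named Configuration.json would typically need to be loaded explicitly (or renamed), as below.

# Minimal sketch (assumption: the transformers library is installed and the
# JSON above is saved locally as "Configuration.json").
from transformers import MistralConfig, MistralForCausalLM

# Build the config object directly from the JSON file; fields not present in
# the file (e.g. num_key_value_heads, rms_norm_eps) fall back to the
# MistralConfig defaults.
config = MistralConfig.from_json_file("Configuration.json")

# Instantiate the architecture named in "architectures"; weights are randomly
# initialized here, since no pretrained checkpoint is loaded.
model = MistralForCausalLM(config)

print(config.model_type, config.hidden_size, config.num_hidden_layers)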