causalLm / config.json
{
  "architectures": [
    "DecoderLayers"
  ],
  "model_type": "gpt2",
  "vocab_size": 35000,
  "n_head": 6,
  "n_layer": 6,
  "n_embd": 384,
  "max_position_embeddings": 512,
  "dropout": 0.1,
  "torch_dtype": "float32"
}
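For reference, a minimal sketch (plain Python, standard library only) estimating the model size implied by this config. It assumes a standard GPT-2 decoder layout (roughly 4·d² attention weights plus 8·d² MLP weights per block; biases and LayerNorms ignored), which is an assumption — the uploaded `DecoderLayers` architecture may differ:

```python
import json

# Config values copied from the config.json above.
config = json.loads("""
{
  "architectures": ["DecoderLayers"],
  "model_type": "gpt2",
  "vocab_size": 35000,
  "n_head": 6,
  "n_layer": 6,
  "n_embd": 384,
  "max_position_embeddings": 512,
  "dropout": 0.1,
  "torch_dtype": "float32"
}
""")

d = config["n_embd"]

# Rough GPT-2-style weight count (assumed layout, biases/LayerNorms ignored):
#   token embeddings:    vocab_size * n_embd
#   position embeddings: max_position_embeddings * n_embd
#   per decoder block:   ~12 * n_embd^2 (4*d^2 attention + 8*d^2 MLP)
embeddings = (config["vocab_size"] + config["max_position_embeddings"]) * d
blocks = config["n_layer"] * 12 * d * d
total = embeddings + blocks

print(f"~{total / 1e6:.1f}M parameters")  # → ~24.3M parameters
```

At `float32` (4 bytes per weight), this puts the checkpoint at roughly 97 MB, which is a useful sanity check against the actual file size on the hub.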