leap0 / config.json
{
"block_size": 768,
"dropout": 0.1,
"model_type": "custom_gpt",
"n_embd": 768,
"n_head": 8,
"n_layer": 8,
"vocab_size": 50304
}
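This config describes a small GPT-style decoder: 8 layers, 8 attention heads, a 768-dimensional embedding, a 768-token context window, and a 50304-entry vocabulary. As a rough sanity check, a minimal sketch of the implied parameter count — assuming tied input/output embeddings, learned position embeddings, the standard 4x MLP expansion, and ignoring layer norms and biases (none of which are stated in the config itself):

```python
import json

# The config above, inlined so the sketch is self-contained.
config = json.loads("""{
    "block_size": 768,
    "dropout": 0.1,
    "model_type": "custom_gpt",
    "n_embd": 768,
    "n_head": 8,
    "n_layer": 8,
    "vocab_size": 50304
}""")

def estimate_params(cfg):
    """Rough parameter count for a GPT-style decoder.

    Assumptions (not stated in the config): tied input/output
    embeddings, learned position embeddings, 4x MLP width,
    layer norms and biases ignored.
    """
    d = cfg["n_embd"]
    embed = cfg["vocab_size"] * d + cfg["block_size"] * d  # token + position embeddings
    per_block = 4 * d * d + 8 * d * d  # attention (QKV + output proj) + MLP (4x expand/contract)
    return embed + cfg["n_layer"] * per_block

print(f"~{estimate_params(config) / 1e6:.1f}M parameters")  # ~95.8M parameters
```

Note that `vocab_size` is 50304 rather than GPT-2's 50257; padding the vocabulary up to a multiple of 64 is a common efficiency choice and does not change the tokenizer.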