gpt2-urdu / config.json
Training in progress, epoch 1
fe7735b verified
{
  "architectures": [
    "GPTModelHF"
  ],
  "context_length": 1024,
  "drop_rate": 0.1,
  "emb_dim": 768,
  "n_heads": 12,
  "n_layers": 12,
  "torch_dtype": "float32",
  "transformers_version": "4.49.0",
  "vocab_size": 50257
}
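Since `GPTModelHF` is a custom architecture name rather than a stock `transformers` model type, this config cannot be assumed to load through `AutoModel` without the accompanying model code. Assuming the fields map onto the standard GPT-2 layout (learned positional embeddings, pre-LayerNorm blocks with a 4x-wide MLP, biases throughout, and a weight-tied output head — an assumption, since the model class itself is not shown here), the parameter count implied by this config can be estimated:

```python
import json

# The config from this repo, embedded so the example is self-contained.
config = json.loads("""
{
  "architectures": ["GPTModelHF"],
  "context_length": 1024,
  "drop_rate": 0.1,
  "emb_dim": 768,
  "n_heads": 12,
  "n_layers": 12,
  "torch_dtype": "float32",
  "transformers_version": "4.49.0",
  "vocab_size": 50257
}
""")

def gpt2_param_count(cfg, tie_embeddings=True):
    """Estimate parameters for a GPT-2-style decoder.

    Assumes the classic GPT-2 layout: learned positional embeddings,
    a 4x-wide MLP per block, biases on all linear layers, and two
    LayerNorms per block. This is an assumption about GPTModelHF.
    """
    d = cfg["emb_dim"]
    layers = cfg["n_layers"]
    vocab = cfg["vocab_size"]
    ctx = cfg["context_length"]

    tok_emb = vocab * d           # token embedding matrix
    pos_emb = ctx * d             # learned positional embeddings
    attn = 4 * d * d + 4 * d      # Q, K, V and output projections (weights + biases)
    mlp = 8 * d * d + 5 * d       # d -> 4d and 4d -> d linear layers
    ln = 2 * (2 * d)              # two LayerNorms (scale + shift) per block
    block = attn + mlp + ln
    final_ln = 2 * d
    head = 0 if tie_embeddings else vocab * d
    return tok_emb + pos_emb + layers * block + final_ln + head

print(f"{gpt2_param_count(config):,}")  # 124,439,808 with tied embeddings
```

The result matches the well-known ~124M parameter count of GPT-2 small, which is consistent with this config reusing GPT-2's hyperparameters (`emb_dim=768`, `n_heads=12`, `n_layers=12`, `vocab_size=50257`) while retraining on Urdu text.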