{
  "architectures": ["GPT2LMHeadModel"],
  "model_type": "gpt2",
  "vocab_size": 50257,
  "n_positions": 64,
  "n_ctx": 64,
  "n_embd": 64,
  "n_layer": 2,
  "n_head": 2
}
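A minimal sketch of how the fields above relate, using only the standard library (the config string is inlined here for illustration; in practice it would live in a `config.json` on disk). One constraint worth checking: `n_embd` must be divisible by `n_head`, since each attention head operates on an equal slice of the embedding dimension.

```python
import json

# The tiny GPT-2 config above, inlined for a self-contained example.
config_text = """
{
  "architectures": ["GPT2LMHeadModel"],
  "model_type": "gpt2",
  "vocab_size": 50257,
  "n_positions": 64,
  "n_ctx": 64,
  "n_embd": 64,
  "n_layer": 2,
  "n_head": 2
}
"""

cfg = json.loads(config_text)

# Each attention head attends over an n_embd / n_head slice,
# so the division must be exact.
assert cfg["n_embd"] % cfg["n_head"] == 0
head_dim = cfg["n_embd"] // cfg["n_head"]

# Context length and position-embedding table should agree.
assert cfg["n_ctx"] == cfg["n_positions"]

print(head_dim)  # -> 32
```

With `from transformers import GPT2Config, GPT2LMHeadModel`, such a file can be loaded via `GPT2Config.from_pretrained(...)` and used to build a randomly initialized model of this (very small) size; that step is omitted here to keep the example dependency-free.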