File size: 135 Bytes
{"architectures": ["LlamaForCausalLM"], "model_type": "llama", "hidden_size": 2048, "num_attention_heads": 32, "num_hidden_layers": 32}