config.json
{
  "architectures": [
    "HFALanguageModel"
  ],
  "model_type": "hfa",
  "vocab_size": 128000,
  "hidden_size": 256,
  "num_hidden_layers": 6,
  "num_attention_heads": 8,
  "intermediate_size": 1024,
  "max_position_embeddings": 715,
  "torch_dtype": "float32",
  "transformers_version": "4.36.0"
}
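
For reference, a minimal sketch of loading and sanity-checking this config with the transformers library. Note that `hfa` is not a model_type built into transformers, so `AutoConfig`/`AutoModel` would only resolve it via the repo's own custom modeling code (`trust_remote_code=True`); the generic `PretrainedConfig` container is used here instead. The embedding-size arithmetic assumes a standard token-embedding table, which is an assumption about the HFA architecture, not something this file states.

```python
import json

from transformers import PretrainedConfig

# Load the config shipped with the checkpoint. PretrainedConfig acts as a
# generic, validation-free container for the fields; it performs no
# hfa-specific checks.
with open("config.json") as f:
    raw = json.load(f)

config = PretrainedConfig(**raw)

# Per-head dimension implied by the fields above: 256 / 8 = 32.
assert config.hidden_size % config.num_attention_heads == 0
print("head_dim:", config.hidden_size // config.num_attention_heads)

# Rough size of the token-embedding table alone (assuming a conventional
# vocab_size x hidden_size embedding matrix): 128000 * 256 = 32,768,000.
print("embedding params:", raw["vocab_size"] * raw["hidden_size"])
```
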