Llava_Next_TopKSAE / config.json
Mayfull — Training in progress, step 10000 (commit fef976a, verified)
{
"architectures": [
"TopKSAE"
],
"dtype": "float32",
"expansion_factor": 32,
"hidden_size": 4096,
"k": 256,
"latent_size": 131072,
"model_type": "topk_sae",
"multi_topk": false,
"normalize_decoder": true,
"transformers_version": "4.57.6"
}
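The config describes a top-k sparse autoencoder: inputs of `hidden_size` 4096 are expanded by `expansion_factor` 32 into `latent_size` 131072 latents, of which only the `k` = 256 largest activations (about 0.2%) are kept per sample, and decoder columns are unit-normalized (`normalize_decoder: true`). A minimal sketch of such a module, assuming a standard TopK-SAE formulation (the class name, `encode` method, and forward-time column normalization are illustrative, not the repo's actual implementation):

```python
import torch

class TopKSAE(torch.nn.Module):
    """Sketch of a top-k sparse autoencoder matching the config fields:
    hidden_size=4096, expansion_factor=32 -> latent_size=131072, k=256."""

    def __init__(self, hidden_size=4096, expansion_factor=32, k=256):
        super().__init__()
        latent_size = hidden_size * expansion_factor
        self.k = k
        self.encoder = torch.nn.Linear(hidden_size, latent_size)
        self.decoder = torch.nn.Linear(latent_size, hidden_size)

    def encode(self, x):
        # Keep only the k largest pre-activations per sample; zero the rest.
        pre = self.encoder(x)
        topk = torch.topk(pre, self.k, dim=-1)
        latents = torch.zeros_like(pre)
        latents.scatter_(-1, topk.indices, torch.relu(topk.values))
        return latents

    def forward(self, x):
        latents = self.encode(x)
        # normalize_decoder=true: unit-norm decoder columns
        # (normalized at forward time here purely for illustration).
        w = self.decoder.weight / self.decoder.weight.norm(dim=0, keepdim=True)
        return latents @ w.T + self.decoder.bias
```

At the configured sizes the encoder and decoder each hold roughly 4096 × 131072 ≈ 537M float32 weights, so the sketch is best exercised with smaller dimensions.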