eai-caia / layers.6 /cfg.json
{
  "activation": "topk",
  "expansion_factor": 2,
  "normalize_decoder": true,
  "num_latents": 0,
  "k": 32,
  "multi_topk": false,
  "skip_connection": false,
  "transcode": false,
  "d_in": 1152
}
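This config describes a TopK sparse autoencoder. A minimal sketch of reading it, assuming the convention (used by EleutherAI's `sparsify`/`sae` family of libraries, which this config resembles) that `num_latents: 0` means the latent width is derived as `d_in * expansion_factor`:

```python
import json

# The cfg.json contents shown above.
cfg = json.loads(
    '{"activation": "topk", "expansion_factor": 2, "normalize_decoder": true, '
    '"num_latents": 0, "k": 32, "multi_topk": false, "skip_connection": false, '
    '"transcode": false, "d_in": 1152}'
)

# Assumption: a zero num_latents means "derive it from d_in * expansion_factor".
num_latents = cfg["num_latents"] or cfg["d_in"] * cfg["expansion_factor"]

print(num_latents)  # 1152 * 2 = 2304 latent features
print(cfg["k"])     # 32: number of latents kept active per input under TopK
```

Under this reading, the SAE maps 1152-dimensional residual-stream activations into 2304 latents and keeps only the 32 largest per input, with the decoder columns normalized to unit norm.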