Mistral7B-SAE / cfg.json
Uploaded by divij30 · commit 1d49649 ("Upload 2 files")
{
  "d_in": 4096,
  "d_sae": 65536,
  "dtype": "float32",
  "device": "cpu",
  "model_name": "mistral-7b",
  "hook_point": "blocks.16.hook_resid_pre",
  "hook_point_layer": 16,
  "hook_point_head_index": null,
  "prepend_bos": false,
  "dataset_path": "monology/pile-uncopyrighted",
  "context_size": 256
}
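The config describes a sparse autoencoder over Mistral-7B's residual stream at layer 16 (hook point `blocks.16.hook_resid_pre`): activations of width `d_in = 4096` are mapped into a dictionary of `d_sae = 65536` features, trained on sequences of 256 tokens from `monology/pile-uncopyrighted`. A minimal sketch of reading the config with Python's standard library follows; the JSON string mirrors the file above (in practice you would load `cfg.json` from disk with `json.load`):

```python
import json

# Inline copy of cfg.json from this repo; normally: cfg = json.load(open("cfg.json"))
cfg = json.loads(
    '{"d_in": 4096, "d_sae": 65536, "dtype": "float32", "device": "cpu", '
    '"model_name": "mistral-7b", "hook_point": "blocks.16.hook_resid_pre", '
    '"hook_point_layer": 16, "hook_point_head_index": null, '
    '"prepend_bos": false, "dataset_path": "monology/pile-uncopyrighted", '
    '"context_size": 256}'
)

# Expansion factor: SAE dictionary width relative to the model's residual-stream width.
expansion = cfg["d_sae"] // cfg["d_in"]
print(expansion)  # 65536 / 4096 = 16

# The encoder/decoder weight shapes this config implies (standard SAE layout):
#   W_enc: (d_in, d_sae) = (4096, 65536)
#   W_dec: (d_sae, d_in) = (65536, 4096)
enc_shape = (cfg["d_in"], cfg["d_sae"])
dec_shape = (cfg["d_sae"], cfg["d_in"])
print(enc_shape, dec_shape)
```

The 16x expansion factor is a common choice for residual-stream SAEs; `hook_point_head_index` is `null` because the hook sits on the full residual stream rather than on a single attention head.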