These SAEs were trained on the output of each MLP in EleutherAI/pythia-70m, using 8.2 billion tokens from the Pile training set at a context length of 2049. Each SAE has 32,768 latents.
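As a minimal sketch of the shapes involved (not the training code, and assuming a plain ReLU encoder for illustration), each SAE maps an MLP output of pythia-70m's hidden size (512) to 32,768 sparse latents and decodes it back:

```python
import numpy as np

d_model = 512       # pythia-70m hidden size
n_latents = 32_768  # number of SAE latents

# Randomly initialized weights for illustration only.
rng = np.random.default_rng(0)
W_enc = rng.standard_normal((d_model, n_latents)) * 0.01
b_enc = np.zeros(n_latents)
W_dec = rng.standard_normal((n_latents, d_model)) * 0.01
b_dec = np.zeros(d_model)

def sae_forward(x):
    """Encode an MLP output into sparse latents, then reconstruct it."""
    latents = np.maximum(x @ W_enc + b_enc, 0.0)  # ReLU keeps latents non-negative and sparse
    recon = latents @ W_dec + b_dec
    return latents, recon

x = rng.standard_normal(d_model)  # stand-in for one MLP output vector
latents, recon = sae_forward(x)
print(latents.shape, recon.shape)  # (32768,) (512,)
```

The actual activation function and training objective may differ; this only illustrates the encoder/decoder dimensions stated above.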