This is a tied crosscoder with 16384 × 3 latents, trained on the layer-14 activations of Gemma 2 2B and Gemma 2 2B IT. We trained it for 250M tokens on a mixed dataset of the Pile and LMSys chat data.
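
As a rough sketch of how a crosscoder like this can be applied, the snippet below collects layer-14 residual-stream activations from both models with `transformers` and passes them through a generic crosscoder module. The `CrossCoder` class, its parameter names, the checkpoint format, and the exact tying scheme are illustrative assumptions rather than the released weights' actual layout; the hidden size (2304) and layer index follow the Gemma 2 2B architecture.

```python
# Minimal sketch: apply a crosscoder to layer-14 activations of Gemma 2 2B (base and IT).
# The CrossCoder class, parameter names, and checkpoint keys below are assumptions,
# not the actual format of the released weights.
import torch
import torch.nn as nn
from transformers import AutoModelForCausalLM, AutoTokenizer

D_MODEL = 2304          # Gemma 2 2B hidden size
N_LATENTS = 16384 * 3   # latent count from the model card

class CrossCoder(nn.Module):
    """Generic crosscoder: one shared latent space, one encoder/decoder slice per model.
    Whether (and how) weights are tied in the released checkpoint is an assumption here."""
    def __init__(self, d_model: int, n_latents: int, n_models: int = 2):
        super().__init__()
        self.W_enc = nn.Parameter(torch.zeros(n_models, d_model, n_latents))
        self.b_enc = nn.Parameter(torch.zeros(n_latents))
        self.W_dec = nn.Parameter(torch.zeros(n_models, n_latents, d_model))
        self.b_dec = nn.Parameter(torch.zeros(n_models, d_model))

    def encode(self, acts: torch.Tensor) -> torch.Tensor:
        # acts: [n_models, tokens, d_model] -> latents: [tokens, n_latents]
        pre = torch.einsum("mtd,mdl->tl", acts, self.W_enc) + self.b_enc
        return torch.relu(pre)

    def decode(self, latents: torch.Tensor) -> torch.Tensor:
        # latents: [tokens, n_latents] -> reconstructions: [n_models, tokens, d_model]
        return torch.einsum("tl,mld->mtd", latents, self.W_dec) + self.b_dec[:, None, :]

@torch.no_grad()
def layer14_acts(model_name: str, text: str) -> torch.Tensor:
    tok = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16)
    ids = tok(text, return_tensors="pt")
    out = model(**ids, output_hidden_states=True)
    # hidden_states[0] is the embedding output, so index 14 is the output of layer 14.
    return out.hidden_states[14][0]  # [tokens, d_model]

base = layer14_acts("google/gemma-2-2b", "The quick brown fox")
it   = layer14_acts("google/gemma-2-2b-it", "The quick brown fox")
acts = torch.stack([base, it]).float()            # [2, tokens, d_model]

crosscoder = CrossCoder(D_MODEL, N_LATENTS)
# crosscoder.load_state_dict(torch.load("crosscoder.pt"))  # hypothetical checkpoint name/keys
latents = crosscoder.encode(acts)
recons = crosscoder.decode(latents)
```

Latents active on one model's reconstruction but not the other's are the usual starting point for comparing the base and IT models, though the exact analysis workflow is outside the scope of this card.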
