| --- |
| license: apache-2.0 |
| library_name: saelens |
| --- |
| |
| # 1. GemmaScope |
|
|
GemmaScope is a collection of sparse autoencoders (SAEs) trained on the internal activations of the Gemma 2 family of models, released to support interpretability research.
|
|
| # 2. What Is `gemmascope-9b-it-res`? |
|
|
- `gemmascope-`: See section 1.
- `9b-it-`: These SAEs were trained on the Gemma 2 9B IT (instruction-tuned) model (TODO link).
- `res`: These SAEs were trained on activations from the residual stream.
# 3. GTM FAQ (TODO(conmy): delete for main rollout)
|
|
| Q1: Why does this model exist in `gg-hf`? |
|
|
| A1: See https://docs.google.com/document/d/1bKaOw2mJPJDYhgFQGGVOyBB3M4Bm_Q3PMrfQeqeYi0M (Google internal only). |
| |
| Q2: What does "SAE" mean? |
| |
A2: Sparse Autoencoder. See https://docs.google.com/document/d/1roMgCPMPEQgaNbCu15CGo966xRLToulCBQUVKVGvcfM (should be accessible to trusted Hugging Face collaborators, as well as Googlers).
| |
| TODO(conmy): remove this when making the main repo. |
| |
# 4. Point of Contact
| |
Point of contact: Arthur Conmy.

Email (reversed to deter scrapers; run the snippet below to recover it):
| |
| ```python |
# Reverse the obfuscated string to recover the address
'moc.elgoog@ymnoc'[::-1]
| ``` |
| |
| HuggingFace account: |
| https://huggingface.co/ArthurConmyGDM |
| |