# Olmo-3 SAEs for use with the SAELens library
This repository contains the following SAEs:
## Olmo-3-1025-7b
- olmo-3-1025-7b/btk-mat-layer-4-k-100
- olmo-3-1025-7b/btk-mat-layer-16-k-100
- olmo-3-1025-7b/btk-mat-layer-28-k-100
Load these SAEs with SAELens as follows:
```python
from sae_lens import SAE

sae = SAE.from_pretrained("decoderesearch/olmo-3-saes", "<sae_id>")
```
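Once loaded, the SAE exposes `encode` and `decode` methods. The sketch below is a minimal example, assuming a recent SAELens release where `from_pretrained` returns the SAE directly (as in the snippet above); it picks the layer-16 id from the list purely for illustration, and uses a random tensor as a stand-in for real Olmo-3 residual-stream activations.

```python
import torch
from sae_lens import SAE

# One of the SAE ids listed above (layer 16 chosen here for illustration).
sae = SAE.from_pretrained(
    "decoderesearch/olmo-3-saes",
    "olmo-3-1025-7b/btk-mat-layer-16-k-100",
)

# Random activations stand in for real Olmo-3 hidden states at this layer;
# sae.cfg.d_in gives the activation width the SAE expects.
acts = torch.randn(8, sae.cfg.d_in)

feature_acts = sae.encode(acts)   # sparse feature activations, shape (8, d_sae)
recon = sae.decode(feature_acts)  # reconstruction of the input activations

print(feature_acts.shape, recon.shape)
```

In practice you would capture the corresponding layer's residual-stream activations from the Olmo-3 model itself (for example via `output_hidden_states=True` in `transformers`) and feed those in place of the random tensor.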