# Aethel-Embed (53M)
Aethel is a memory-augmented hybrid embedding model designed for efficiency and long-context performance.
## Model Details
- Parameters: ~53M
- Architecture: Gated DeltaNet (6 layers) + Sliding Window Attention + TITANS-lite Memory
- Embedding Dimension: 768 (Matryoshka-capable)
- Context Length: Optimized for long-context retrieval
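Matryoshka-capable means the leading components of the 768-d embedding can be kept and the rest discarded, trading quality for storage. A minimal sketch of that truncation step follows; the function name and the use of NumPy are illustrative, not part of the Aethel API.

```python
import numpy as np

def truncate_embedding(vec: np.ndarray, dim: int) -> np.ndarray:
    """Keep the first `dim` components and re-normalize to unit length.

    Hypothetical helper: Matryoshka-trained models are built so this
    prefix remains a usable embedding on its own.
    """
    head = vec[:dim]
    return head / np.linalg.norm(head)

full = np.random.default_rng(0).standard_normal(768)
short = truncate_embedding(full, 256)
print(short.shape)  # (256,)
```

After truncation the vector is re-normalized so that cosine similarity remains meaningful at the reduced dimension.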
## Usage
This model requires the `aethel` library code included in this repository.
```python
import torch
from transformers import AutoTokenizer

from aethel.model.aethel_model import AethelModel

# Instantiate the architecture, then load the trained weights
model = AethelModel(vocab_size=32000, dim=768)
checkpoint = torch.load("aethel-step5000.pt", map_location="cpu")
# Checkpoints may store weights under a "model" key or as a bare state dict
model.load_state_dict(checkpoint.get("model", checkpoint))
model.eval()  # disable dropout etc. for inference

# Load the tokenizer shipped in this repository
tokenizer = AutoTokenizer.from_pretrained("tokenizer/")
```
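To turn per-token hidden states into a single sentence embedding, a common approach is masked mean pooling. The sketch below assumes the model returns a `(seq_len, dim)` array of hidden states plus an attention mask; Aethel's actual output format may differ, and the shapes and helper name here are hypothetical.

```python
import numpy as np

def mean_pool(hidden: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average hidden states over real tokens, ignoring padding.

    hidden:         (seq_len, dim) per-token states
    attention_mask: (seq_len,) with 1 for real tokens, 0 for padding
    """
    mask = attention_mask[:, None].astype(hidden.dtype)  # (seq_len, 1)
    summed = (hidden * mask).sum(axis=0)
    counts = np.maximum(mask.sum(), 1e-9)  # avoid divide-by-zero
    return summed / counts

hidden = np.ones((4, 768))     # 4 tokens, 768-dim states (toy values)
hidden[3] = 100.0              # junk values at a padding position
mask = np.array([1, 1, 1, 0])  # last position is padding
emb = mean_pool(hidden, mask)
print(emb.shape)  # (768,)
```

Because the padding position is masked out, the junk values at index 3 do not leak into the pooled embedding.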