Aethel-Embed (53M)

Aethel is a memory-augmented hybrid embedding model designed for efficiency and long-context performance.

Model Details

  • Parameters: ~53M
  • Architecture: Gated DeltaNet (6 layers) + Sliding Window Attention + TITANS-lite Memory
  • Embedding Dimension: 768 (Matryoshka-capable)
  • Context Length: Optimized for long-context retrieval
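Because the 768-dimensional embeddings are Matryoshka-capable, a prefix of each vector can be kept and re-normalized to trade quality for storage. A minimal sketch (the function name `truncate_matryoshka` is illustrative, not part of the `aethel` library; it assumes embeddings are plain torch tensors):

```python
import torch
import torch.nn.functional as F

def truncate_matryoshka(embeddings: torch.Tensor, dim: int) -> torch.Tensor:
    """Keep the first `dim` components of each embedding and re-normalize
    to unit L2 norm, as Matryoshka-style embeddings allow."""
    truncated = embeddings[..., :dim]
    return F.normalize(truncated, p=2, dim=-1)

# Example: shrink a batch of 768-d embeddings down to 256 dimensions
full = F.normalize(torch.randn(4, 768), p=2, dim=-1)
small = truncate_matryoshka(full, 256)
print(small.shape)  # torch.Size([4, 256])
```

Cosine similarities computed on the truncated vectors approximate those of the full 768-d embeddings, with fidelity degrading gracefully as `dim` shrinks.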

Usage

This model requires the aethel library code included in this repository.

```python
import torch
from transformers import AutoTokenizer

from aethel.model.aethel_model import AethelModel

# Instantiate the architecture, then load the trained weights
model = AethelModel(vocab_size=32000, dim=768)
checkpoint = torch.load("aethel-step5000.pt", map_location="cpu")
model.load_state_dict(checkpoint.get("model", checkpoint))
model.eval()  # disable dropout etc. for inference

# Load the tokenizer shipped alongside the checkpoint
tokenizer = AutoTokenizer.from_pretrained("tokenizer/")
```
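This card does not document the output format of `AethelModel`'s forward pass. Assuming it returns per-token hidden states of shape `(batch, seq_len, 768)`, a common way to obtain one embedding per sequence is mean pooling over non-padding tokens. A self-contained sketch with dummy tensors (the `mean_pool` helper is an assumption, not part of the `aethel` API):

```python
import torch

def mean_pool(hidden_states: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    """Average token embeddings over the sequence, ignoring padding
    positions indicated by 0s in the attention mask."""
    mask = attention_mask.unsqueeze(-1).to(hidden_states.dtype)  # (B, T, 1)
    summed = (hidden_states * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1e-9)  # avoid division by zero
    return summed / counts

# Dummy data standing in for model output: 2 sequences, 8 tokens, 768-d states
hidden = torch.randn(2, 8, 768)
mask = torch.tensor([[1] * 8, [1] * 5 + [0] * 3])  # second sequence has 3 pad tokens
emb = mean_pool(hidden, mask)
print(emb.shape)  # torch.Size([2, 768])
```

In practice `hidden` would come from running the model on `tokenizer(...)` output; the pooled vectors can then be L2-normalized for cosine-similarity retrieval.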