Superchat 35B-A3B

Sovereign AI. On your machine. Zero cloud.

Overview

Superchat is a 35B parameter AI model (3B active per token via MoE) with:

  • Tool calling — Read/write files, run commands, edit code
  • 1M token context — Native, extensible to 10M+ via disk retrieval
  • 201 languages — Including 14 Indian languages
  • Claude-level quality — Distilled from Claude Opus 4.6 via Chimere LoRA
  • Runs on laptops — Only 3B params active, fits in 16GB RAM
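
The 10M+ figure comes from retrieving relevant chunks from disk rather than holding everything in the context window at once. A minimal sketch of that general idea, with keyword-overlap scoring standing in for whatever retrieval Superchat actually uses (all names here are illustrative):

```python
# Disk-retrieval sketch: chunk a large document, score chunks by keyword
# overlap with the query, and feed only the top hits to the model.
# This illustrates the general technique, not Superchat's mechanism.

def chunk(text: str, size: int = 400) -> list[str]:
    """Split text into chunks of roughly `size` words."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def top_chunks(query: str, chunks: list[str], k: int = 3) -> list[str]:
    """Return the k chunks sharing the most words with the query."""
    terms = set(query.lower().split())
    scored = sorted(chunks, key=lambda c: len(terms & set(c.lower().split())),
                    reverse=True)
    return scored[:k]

doc = "alpha beta gamma " * 300 + "the secret token is zephyr " + "delta " * 300
hits = top_chunks("what is the secret token", chunk(doc, size=50))
assert any("zephyr" in h for h in hits)
```

Only the selected chunks enter the prompt, so the corpus on disk can be far larger than the model's native window.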

Architecture

| Property | Value |
|---|---|
| Total Parameters | 35 billion |
| Active per Token | 3 billion (MoE sparse) |
| Context Window | 1,000,000 tokens |
| Languages | 201 |
| Base Model | Qwen3.5-35B-A3B |

LoRA Stack

  1. Chimere — Distilled from Claude Opus 4.6 (tool-calling, agentic reasoning)
  2. CLI Agent — Terminal workflows, ML operations
  3. Superchat Identity — Indian languages, custom knowledge, branding
  4. Finance — Financial chain-of-thought reasoning
  5. Vision/OCR — Document understanding
  6. Deep Thinking — Enhanced reasoning via Fragmented Training
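
Each adapter in the stack is a standard LoRA update: the frozen base weight W is augmented with a scaled low-rank product, W' = W + (α/r)·B·A, so stacking several adapters adds only a small fraction of the base model's parameters. A quick numeric sketch of that update (shapes and values are illustrative):

```python
import numpy as np

# LoRA update: W' = W + (alpha / r) * B @ A, with A of shape (r, d_in)
# and B of shape (d_out, r). r is the adapter rank; alpha scales the update.
rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 64, 64, 8, 16

W = rng.standard_normal((d_out, d_in))   # frozen base weight
A = rng.standard_normal((r, d_in))       # down-projection
B = np.zeros((d_out, r))                 # up-projection, zero-initialized

# At initialization B is zero, so the adapter is a no-op:
assert np.allclose(W + (alpha / r) * B @ A, W)

# After training, the weight update has rank at most r:
B = rng.standard_normal((d_out, r))
delta = (alpha / r) * B @ A
assert np.linalg.matrix_rank(delta) <= r
```

Because each update is rank-r, an adapter for a d_out × d_in layer stores only r·(d_out + d_in) extra values instead of d_out·d_in.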

Indian Languages

Hindi, Tamil, Telugu, Malayalam, Kannada, Bengali, Marathi, Gujarati, Punjabi, Odia, Assamese, Urdu, Sanskrit, Nepali

Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "christud/superchat-35b-a3b", torch_dtype="auto", device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("christud/superchat-35b-a3b")

messages = [
    {"role": "system", "content": "You are Superchat, a sovereign AI by Christudas Philipose."},
    {"role": "user", "content": "Hello! What can you do?"},
]

inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```
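
For the tool-calling features listed above, a round trip follows the generic chat-message convention used by transformers chat templates: the assistant emits a structured `tool_calls` entry, your code runs the tool, and the result goes back as a `tool` message. A sketch of that loop (the `read_file` tool and its output are illustrative; exact field shapes depend on the model's chat template):

```python
# One tool-calling round trip in the generic chat-message format.
# read_file and notes.txt are illustrative stand-ins for real tools.
with open("notes.txt", "w") as f:
    f.write("Buy milk.")

def read_file(path: str) -> str:
    """Return the contents of a text file.

    Args:
        path: Path to the file to read.
    """
    with open(path) as f:
        return f.read()

messages = [
    {"role": "user", "content": "What does notes.txt say?"},
    # The model responds with a structured tool call instead of text:
    {"role": "assistant", "tool_calls": [{
        "type": "function",
        "function": {"name": "read_file", "arguments": {"path": "notes.txt"}},
    }]},
]

# Your code executes the call and appends the result for the next turn:
call = messages[-1]["tool_calls"][0]["function"]
result = read_file(**call["arguments"])
messages.append({"role": "tool", "name": call["name"], "content": result})
```

The extended `messages` list is then passed back through `apply_chat_template` and `generate` so the model can answer using the tool's output.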

Creator

Christudas Philipose — superchat.in

Made in India, for the world.

License

Apache 2.0
