Metis-1.4 Chat

Metis-1.4 Chat is the chat-tuned release from Lernex's Metis-1.4 research run: a compact ~500M-parameter MoR-style language model built as a step toward Lernex's efficient "super stack" model line.

This checkpoint is the Chat SFT model, exported after base pretraining, sequence-MoR continued pretraining, and the chat supervised fine-tuning stage. Reward modeling and DPO were intentionally skipped for this release to save time and cost once the chat and think SFT checkpoints were complete.

Metis-1.4 is a small research model. Expect legible but limited behavior, especially on math, code, and deep reasoning. The point of this release is transparency: showing the working state of the Metis line and preserving the exact artifacts for benchmarking and iteration.

Files

  • model.safetensors
  • config.json
  • generation_config.json
  • tokenizer.json
  • tokenizer_config.json
  • special_tokens_map.json

Intended Use

Use this model for lightweight chat probing, small-model behavior experiments, tutoring-style prompt tests, and comparisons against the Think SFT variant. It is not a production assistant and should not be treated as a reliable factual system.
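For lightweight chat probing, the checkpoint can be loaded with Hugging Face transformers. This is a minimal sketch, assuming the repo id `Lernex/Metis-1.4-Chat` (substitute the actual Hub id) and that the bundled tokenizer_config.json defines a chat template:

```python
# Hypothetical usage sketch -- the repo id below is an assumption,
# not a confirmed Hub path; replace it with the actual repository name.
from transformers import AutoModelForCausalLM, AutoTokenizer

def chat(prompt: str, repo_id: str = "Lernex/Metis-1.4-Chat") -> str:
    """Send a single user turn to the model and return the decoded reply."""
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    # The exported weights are BF16, so load in bfloat16 to match.
    model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype="bfloat16")
    messages = [{"role": "user", "content": prompt}]
    # Format the turn with the checkpoint's own chat template.
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(inputs, max_new_tokens=128)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)

if __name__ == "__main__":
    print(chat("Explain what a tokenizer does in one sentence."))
```

Given the model's small size, expect the most legible behavior on short, conversational prompts rather than multi-step reasoning tasks.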

Release Note

This is the corrected Metis-1.4 Chat SFT release, part of Lernex's ongoing work toward highly efficient compact reasoning and learning models.

Model Details

  • Format: Safetensors
  • Model size: 0.5B params
  • Tensor type: BF16