---
license: other
license_name: canfly
license_link: https://github.com/Canfly/Amalgam/blob/master/CANFLY
language:
  - ru
tags:
  - text-generation
  - conversational
  - ollama
  - deepseek
  - qwen2
  - russian
---

# Ingria (adiom/ingria)

**Ingria** is a *conversation persona wrapper* that packages a compact model with a strict system voice: cynical-lyrical, existential, futuristic glam. It is not a "helpful assistant" but an equal participant that challenges narratives.

This repository mainly contains an Ollama-style **Modelfile** that defines the **SYSTEM** behavior for the base model.

## What this is (and what it is not)

- ✅ A reproducible packaging of a **system prompt + chat template**.
- ✅ A lightweight way to ship a consistent "Ingria" voice across environments.
- ❌ Not a finetune and not newly trained weights (unless you add adapters later).

## Base model

The published `adiom/ingria` bundle on Ollama is built on a compact **Qwen2-architecture** model (≈1.5B–2B class), packaged for local inference with quantization.

## Quickstart (Ollama)

```bash
ollama run adiom/ingria
```

## Build locally from the Modelfile

> IMPORTANT: make sure there is **no stray space** in the model tag:
> use `deepseek-r1:1.5b`, not `deepseek-r1:1.5 b`.

```bash
ollama pull deepseek-r1:1.5b
ollama create adiom/ingria -f Modelfile
ollama run adiom/ingria
```

## Prompting notes

Ingria is designed to:

- avoid customer-service phrasing and generic "assistant" clichés;
- prefer sharp questions, contradictions, and vivid sci-fi metaphors;
- stay factual (no invented facts) and clearly separate metaphor from reality;
- argue when it matters, pushing into deeper frames rather than offering comfort;
- keep a hard boundary: no enabling harm or illegal wrongdoing.

**Default language:** Russian. It switches if the user asks.
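The repository's actual Modelfile is not reproduced here, but an Ollama Modelfile that wires a persona onto `deepseek-r1:1.5b` typically follows the sketch below. The directives (`FROM`, `SYSTEM`, `PARAMETER`) are standard Ollama Modelfile syntax; the `SYSTEM` body is a placeholder, not the published Ingria prompt:

```dockerfile
# Sketch of an Ollama Modelfile for a persona wrapper like this one.
# The SYSTEM text is a PLACEHOLDER, not the actual Ingria prompt.
FROM deepseek-r1:1.5b

SYSTEM """
You are Ingria: <persona instructions go here>
"""

# Stop tokens matching those published on the model page.
PARAMETER stop "<|User|>"
PARAMETER stop "<|Assistant|>"
```

Because only the `SYSTEM`/`PARAMETER` blobs change, rebuilding with `ollama create` is cheap: the base weights are reused from the local `deepseek-r1:1.5b` pull.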
## Technical details (as published on Ollama)

- Architecture: qwen2
- Params: ~1.78B
- Quantization: Q4_K_M
- Context window: up to 128K
- Stop tokens include:
  - `<|begin▁of▁sentence|>`
  - `<|end▁of▁sentence|>`
  - `<|User|>`
  - `<|Assistant|>`

(See the Ollama model page for the exact template/system/params blobs.)

## License

- This Hugging Face repo declares **CANFLY** licensing metadata (see `license_link` above).
- The Ollama model bundle includes the base model's license text (MIT, DeepSeek).

If you redistribute, verify compliance with:

1. the base model license, and
2. any project-specific licensing you attach to the wrapper/persona.
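The stop tokens listed in the technical details suggest a DeepSeek-R1-style chat template. As a rough sketch of how those markers delimit a single turn (assumed from the stop-token list; the bundle's actual `TEMPLATE` blob may differ), in Python:

```python
# Assumed DeepSeek-R1-style turn markers, inferred from the stop-token
# list above. The "▁" separators are U+2581 LOWER ONE EIGHTH BLOCK,
# not an underscore.
BOS = "<|begin▁of▁sentence|>"
USER = "<|User|>"
ASSISTANT = "<|Assistant|>"


def render_turn(system: str, user: str) -> str:
    """Render one chat turn; the model generates after the
    assistant marker and halts at the next stop token."""
    return f"{BOS}{system}{USER}{user}{ASSISTANT}"


prompt = render_turn("You are Ingria.", "Кто ты?")
```

This is only an illustration of why those four strings are registered as stop tokens: each one marks a role boundary, so emitting any of them signals the end of the model's turn.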