# THE ARCHITECT: GHOST CODEX OMEGA V2.5

Project Classification: BLACK SITE / ZERO-COST FORTRESS
This repository serves as the central hub for the Architect Ecosystem. It is a high-fidelity LaTeX workstation designed for secure, zero-cost architectural drafting and technical documentation.
## System Architecture
The project is split into three distinct, synchronized layers:
- Logic Tier (The Brain): A fine-tuned GGUF model running locally via Ollama, keeping prompts fully private and inference latency low.
- Forge Tier (The Interface): A supreme CustomTkinter GUI (.exe) featuring real-time streaming, split-pane code monitoring, and project archiving.
- Compiler Tier (The Factory): A private Docker-based TeX Live Space hosted on Hugging Face that handles heavy PDF rendering without requiring local installations.
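As a sketch of how the Forge Tier might talk to the Logic Tier, the snippet below builds a request payload for Ollama's local `/api/generate` endpoint. The `architect-v2` model name follows the setup instructions later in this README; the prompt and the helper name are illustrative assumptions.

```python
import json

# Ollama's default local endpoint for text generation.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(prompt: str, model: str = "architect-v2") -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint.

    stream=True lets a GUI render tokens as they arrive, which is the
    behaviour the Live Forge Monitor feature below describes.
    """
    return {
        "model": model,
        "prompt": prompt,
        "stream": True,
    }

# Example payload, ready to POST to OLLAMA_URL:
payload = build_generate_request("Draft a TikZ diagram of a cantilever beam.")
print(json.dumps(payload))
```

Because the model runs locally, the prompt in this payload never leaves the machine.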
## Omega Protocol Features
- Live Forge Monitor: Real-time streaming of LaTeX logic directly into the UI.
- Style Protocol Injection: Dropdown presets that automatically configure the AI for TikZ, PGFPlots, or Scientific formatting.
- Full Project Archiving: One-click export of the raw instructions (.txt), the source code (.tex), and the final render (.pdf).
- Zero-Weight Portability: The standalone executable requires no local LaTeX distribution (MiKTeX/TeX Live) to function.
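A minimal sketch of the one-click project archive: bundle the raw instructions (.txt), the LaTeX source (.tex), and the final render (.pdf) into a single zip. The function name and file layout are assumptions for illustration, not the tool's actual internals.

```python
import zipfile
from pathlib import Path

def archive_project(out_dir: str, name: str, instructions: str,
                    tex_source: str, pdf_bytes: bytes) -> Path:
    """Bundle prompt, LaTeX source, and rendered PDF into one zip,
    mirroring the one-click Full Project Archiving described above."""
    out = Path(out_dir) / f"{name}.zip"
    with zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(f"{name}.txt", instructions)   # raw instructions
        zf.writestr(f"{name}.tex", tex_source)     # source code
        zf.writestr(f"{name}.pdf", pdf_bytes)      # final render
    return out
```

Everything stays on disk locally, consistent with the zero-cost, privacy-first design.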
## Deployment Instructions
### Local PC Setup
- Download the GGUF Model from ShiroOnigami23/THE_ARCHITECT_V2_FINAL.
- Initialize the model in Ollama: `ollama create architect-v2 -f Modelfile`
- Run `Architect_Omega.exe`.
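The Live Forge Monitor's real-time streaming maps onto Ollama's response format: `/api/generate` with `stream` enabled returns newline-delimited JSON chunks, each carrying a `response` fragment and a `done` flag. A sketch of accumulating such a stream into the full LaTeX output (the parsing helper itself is an assumption):

```python
import json
from typing import Iterable

def collect_stream(lines: Iterable[str]) -> str:
    """Accumulate the 'response' fields of Ollama's NDJSON stream
    chunks into the complete output, stopping at the 'done' chunk."""
    parts = []
    for line in lines:
        if not line.strip():
            continue
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)
```

In the GUI, each fragment would also be appended to the split-pane code monitor as it arrives, rather than waiting for the final chunk.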
### Cloud Infrastructure
- Model Engine: THE_ARCHITECT_V2_FINAL
- Training Ground: ARCHITECT-HIGH-FID-100K
- Compiler Core: LATEX-COMPILER-CORE
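As a sketch of how a client might hand LaTeX source to the Docker-based compiler Space for PDF rendering, the function below prepares an HTTP file-upload request. The Space URL, endpoint path, and form-field name are all hypothetical, since the compiler's actual API is not documented here.

```python
def build_compile_request(tex_source: str,
                          space_url: str = "https://example.hf.space/compile"):
    """Prepare (url, files) for an HTTP POST that ships LaTeX source to a
    remote TeX Live compiler. URL and 'file' field name are assumptions."""
    files = {
        "file": ("main.tex", tex_source.encode("utf-8"), "text/x-tex"),
    }
    return space_url, files
```

Offloading compilation this way is what lets the standalone executable skip any local MiKTeX/TeX Live installation.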
Security Rating: 9.5/10 (Post-Quantum & Anti-Intelligence Resistant)

Developed by: ShiroOnigami23