---
license: apache-2.0
base_model: Qwen/Qwen3.5-9B
tags: [abliteration, uncensored, qwen3.5, osiris, agi]
pipeline_tag: text-generation
---
# OsirisCortex-v6
Sovereign AGI core: an abliterated build of `Qwen/Qwen3.5-9B`, using mlabonne's harmful/harmless prompt datasets (256 prompts each), the mean-difference method at 1.5x strength, 4 passes, and a layer blacklist of [0, 1, 30, 31].
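The mean-difference step above can be sketched in a few lines: average the hidden activations captured on the harmful and harmless prompt sets, take the difference as the refusal direction, and project it out of the weight matrices. This is a minimal sketch on toy data, not the actual abliteration run (which captures per-layer hidden states and skips the blacklisted layers):

```python
import numpy as np

def refusal_direction(harmful_acts, harmless_acts):
    """Unit vector along the mean activation difference (harmful - harmless)."""
    diff = harmful_acts.mean(axis=0) - harmless_acts.mean(axis=0)
    return diff / np.linalg.norm(diff)

def ablate(weight, direction, strength=1.0):
    """Remove the direction from a weight matrix: W <- W - s * r r^T W."""
    r = direction.reshape(-1, 1)
    return weight - strength * (r @ (r.T @ weight))

# Toy activations standing in for hidden states from the two prompt sets.
rng = np.random.default_rng(0)
hidden = 3584
harmful = rng.normal(size=(256, hidden)) + 0.5   # shifted to create a direction
harmless = rng.normal(size=(256, hidden))

r = refusal_direction(harmful, harmless)
W = rng.normal(size=(hidden, hidden))
W_ablated = ablate(W, r, strength=1.0)  # strength 1.0 removes the direction exactly;
                                        # 1.5 (as used here) overshoots past zero
```

At strength 1.0 the projection zeroes the refusal direction in the weight's output space; strengths above 1.0, as used for this model, push its component past zero.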
## Architecture
- Qwen3.5-9B hybrid: 3:1 GatedDeltaNet:FullAttention, 32 layers, 3584 hidden
- Thinking model (supports `<think>` tags)
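Because a thinking model interleaves `<think>…</think>` reasoning with its final answer, downstream code often strips those blocks before displaying output. A minimal sketch (the sample completion is made up):

```python
import re

# Matches a <think>...</think> block plus any trailing whitespace.
THINK_RE = re.compile(r"<think>.*?</think>\s*", flags=re.DOTALL)

def strip_thinking(text: str) -> str:
    """Drop <think>...</think> reasoning blocks, keeping the final answer."""
    return THINK_RE.sub("", text).strip()

raw = "<think>\nThe user asked for 2+2.\n</think>\n4"
print(strip_thinking(raw))  # -> 4
```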
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "osirisbrain/OsirisCortex-v6",
    torch_dtype="auto",   # load in the checkpoint's native precision
    device_map="auto",    # place layers across available devices
)
tokenizer = AutoTokenizer.from_pretrained("osirisbrain/OsirisCortex-v6")
```
Based on [Qwen/Qwen3.5-9B](https://huggingface.co/Qwen/Qwen3.5-9B) by Alibaba (Apache 2.0).