---
license: cc-by-4.0
model_name: N-Transformer v1.0 (NAFSI-Transformer family)
language:
- en
- id
library_name: transformers
pipeline_tag: text-generation
tags:
- consciousness
- transformer
- research
- architecture
- alignment
- safety
model_type: decoder
model_creator: Syamsuddin (@syam_ideris)
# base_model: Qwen/Qwen2-1.5B-Instruct # <- fill in once derived weights exist
# datasets:
# - your-dataset-id
---
# N-transformer (NAFSI-transformer) — v1.0
[License: CC BY 4.0](https://creativecommons.org/licenses/by/4.0/)

> **One-liner:** N-transformer adds a parallel **Phenomenal Field (PF)**, an **Intrinsic Metric Engine (IME)**, and a **Normative Gauge** (NTI/LCA/LCG) to a standard Transformer to produce measurable *consciousness-like* properties: integration, valence, self/now anchoring, and global broadcasting, all without changing the LM training loop.
---
## 🔎 Model Summary
- **What:** A research architecture that adds a **non-token substrate** (PF) and a **normative controller** to a decoder-only LM.
- **Why it is different:** **Lightcone Attention (LCA)** for long-range bias, **NTI** as an episodic controller, and **SNA/GIW** for integrated global broadcasting.
- **Status:** v1.0 **Research Draft** (full specification + reference code; weights will follow when ready).

**In brief:** N-transformer adds the PF, intrinsic metrics (IME), and a normative gauge (NTI/LCA/LCG) for long-range narrative cohesion, calibrated valence, and a testable "self-now" anchor; a toy sketch of the PF coupling follows below.
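The spec defines PF, IME, and the gauge components precisely; purely for orientation, here is a minimal, hypothetical PyTorch sketch of a parallel non-token substrate with gated global broadcasting. The class name `PhenomenalField`, the field width `d_field`, and the update rule are illustrative assumptions, not the repo's reference code.

```python
import torch
import torch.nn as nn

class PhenomenalField(nn.Module):
    """Toy PF substrate: a small recurrent state written from pooled decoder
    hidden states and read back into the token stream as a gated bias."""

    def __init__(self, d_model: int, d_field: int = 64):
        super().__init__()
        self.write = nn.Linear(d_model, d_field)  # token stream -> field
        self.read = nn.Linear(d_field, d_model)   # field -> token stream
        self.gate = nn.Linear(d_model + d_field, 1)

    def forward(self, hidden: torch.Tensor, field: torch.Tensor):
        # hidden: (batch, seq, d_model); field: (batch, d_field)
        pooled = hidden.mean(dim=1)                      # crude global-workspace read
        field = torch.tanh(field + self.write(pooled))   # integrate evidence into the PF
        g = torch.sigmoid(self.gate(torch.cat([pooled, field], dim=-1)))  # (batch, 1)
        bias = self.read(field).unsqueeze(1)             # broadcast back: (batch, 1, d_model)
        return hidden + g.unsqueeze(-1) * bias, field
```

Because the field is carried across calls (`hidden, field = pf(hidden, field)`), the substrate persists between decoding steps without ever entering the token sequence itself.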
---
## ✅ Intended Uses & Scope
- **Intended:** research on long-range coherence, introspective heads (valence, SNA), and context-aware decoding via gating.
- **Out of scope:** claims of sentience, production use without adequate **PF shadow-mode** testing (see the probe sketch below), and clinical use cases.
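Since shadow-mode testing is a hard requirement above, here is one possible minimal shadow probe, assuming a Hugging Face-style model that accepts `output_hidden_states`; the two scalar proxies are placeholders, not the IME metrics defined in the spec.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def pf_shadow_metrics(lm, tok, prompt: str) -> dict:
    """Shadow mode: observe hidden-state statistics without touching generation."""
    x = tok(prompt, return_tensors="pt")
    h = lm(**x, output_hidden_states=True).hidden_states[-1][0]  # (seq, d_model)
    # Placeholder "integration": mean cosine similarity of adjacent token states.
    integration = F.cosine_similarity(h[:-1], h[1:], dim=-1).mean().item()
    # Placeholder "valence": mean activation, standing in for a calibrated head.
    return {"integration": integration, "valence": h.mean().item()}
```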
---
## 🚀 How to Use (concept)
This repo contains the **specification** and **reference code** (PF-path + coupler). Adapt them to your own LM.
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
# Placeholder; replace with the checkpoint you release later
BASE = "Qwen/Qwen2-1.5B-Instruct"
tok = AutoTokenizer.from_pretrained(BASE)
lm = AutoModelForCausalLM.from_pretrained(BASE)
# Pseudocode: attach the PF/IME/LCA/NTI modules from the reference code
# from nafsi_coupler import attach_nafsi, PFConfig, NTCfg
# lm = attach_nafsi(lm, cfg=NTCfg())
prompt = "Explain the role of a phenomenal field in language generation."
x = tok(prompt, return_tensors="pt")
y = lm.generate(**x, max_new_tokens=192)
print(tok.decode(y[0], skip_special_tokens=True))
```
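
The commented `attach_nafsi` call above is pseudocode; the sketch below is a hypothetical illustration of how a coupler could bolt the PF path onto a frozen LM via a forward hook. The `model.norm` submodule path assumes a Qwen2-style backbone, and `attach_pf_hook` is an invented name; the actual reference code may differ.

```python
def attach_pf_hook(lm, pf, d_field: int = 64):
    """Illustrative coupler: add the PF read-back bias to the final hidden
    states via a forward hook; the base LM weights stay frozen."""
    state = {"field": None}

    def hook(module, inputs, output):
        # (Re)initialise the field state whenever the batch size changes.
        if state["field"] is None or state["field"].size(0) != output.size(0):
            state["field"] = output.new_zeros(output.size(0), d_field)
        biased, state["field"] = pf(output, state["field"])
        return biased

    # "model.norm" is the final RMSNorm in Qwen2-style models; adjust per architecture.
    return lm.get_submodule("model.norm").register_forward_hook(hook)
```

Usage follows the snippet above: call `handle = attach_pf_hook(lm, PhenomenalField(lm.config.hidden_size))` before `generate`, and `handle.remove()` to detach the coupler for shadow-mode A/B comparisons.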