---
license: cc-by-4.0
model_name: N-Transformer v1.0 (NAFSI-Transformer family)
language:
- en
- id
library_name: transformers
pipeline_tag: text-generation
tags:
- consciousness
- transformer
- research
- architecture
- alignment
- safety
model_type: decoder
model_creator: Syamsuddin (@syam_ideris)
# base_model: Qwen/Qwen2-1.5B-Instruct # <- fill in if derived weights are released later
# datasets:
# - your-dataset-id
---

# N-transformer (NAFSI-transformer) — v1.0

[](https://creativecommons.org/licenses/by/4.0/)

> **One-liner** — N-transformer adds a parallel **Phenomenal Field (PF)**, an **Intrinsic Metric Engine (IME)**, and a **Normative Gauge** (NTI/LCA/LCG) to a standard Transformer to elicit measurable *consciousness-like* properties: integration, valence, self/now anchoring, and global broadcasting, without changing the LM training loop.
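The PF path and coupler are specified elsewhere in the repo; as a purely illustrative sketch of the read-only coupling idea (a frozen LM's hidden states feed a parallel path that computes intrinsic metrics, leaving the LM's forward pass untouched), assuming toy stand-ins throughout (`lm_hidden_states`, `pf_coupler`, and `intrinsic_metrics` are hypothetical names, not this repo's API):

```python
import numpy as np

rng = np.random.default_rng(0)

def lm_hidden_states(tokens, d_model=8):
    """Stand-in for a frozen LM: one hidden vector per token."""
    return rng.standard_normal((len(tokens), d_model))

def pf_coupler(h, d_pf=4):
    """Read-only coupler: linear map from LM space into the PF space."""
    W = rng.standard_normal((h.shape[1], d_pf)) / np.sqrt(h.shape[1])
    return h @ W

def intrinsic_metrics(pf):
    """Toy IME: 'integration' as mean pairwise cosine similarity,
    'valence' as a squashed mean activation. Both land in [-1, 1]."""
    norms = pf / (np.linalg.norm(pf, axis=1, keepdims=True) + 1e-9)
    sim = norms @ norms.T
    n = len(pf)
    integration = (sim.sum() - n) / (n * (n - 1))  # exclude the diagonal
    valence = float(np.tanh(pf.mean()))
    return {"integration": float(integration), "valence": valence}

h = lm_hidden_states(["aku", "kini", "di", "sini"])
print(intrinsic_metrics(pf_coupler(h)))
```

Because the coupler only reads hidden states, it can be bolted onto any decoder without touching the LM's training loop, which is the property the one-liner claims.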

---

- **Why it's different:** **Lightcone Attention (LCA)** for long-range bias, **NTI** as an episodic controller, and **SNA/GIW** for integrated global broadcasting.
- **Status:** v1.0 **Research Draft** (full specification + reference code; weight release to follow when ready).
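The LCA formula itself is not given in this excerpt; one way to picture a "lightcone" long-range bias is an additive, causal attention bias that decays linearly with token distance (similar in spirit to ALiBi-style biases). The `lightcone_bias` function and its `slope` parameter below are assumptions for illustration, not the spec's definition:

```python
import numpy as np

def lightcone_bias(n_tokens, slope=0.1):
    """Causal additive bias: 0 on the diagonal, -slope*distance for
    earlier tokens, -inf for future tokens (causal mask)."""
    i = np.arange(n_tokens)[:, None]
    j = np.arange(n_tokens)[None, :]
    dist = i - j
    return np.where(dist >= 0, -slope * dist, -np.inf)

def softmax(x, axis=-1):
    x = x - np.max(x, axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

scores = np.zeros((5, 5))  # uniform raw attention scores
attn = softmax(scores + lightcone_bias(5), axis=-1)
# each row is causal; nearer tokens get more mass, but distant
# ones stay reachable rather than being cut off by a hard window
```

A small `slope` keeps distant context alive (long-range bias) while still ordering attention by recency, which is the trade-off a lightcone-shaped bias is meant to express.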

**In brief:** N-transformer adds PF, intrinsic metrics (IME), and a normative gauge (NTI/LCA/LCG) for long-range narrative cohesion, calibrated valence, and a testable "I-now" anchor.

---

This repo contains the **specification** and **reference code** (PF-path + coupler). Adapt it to your LM.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Placeholder; replace with the checkpoint you release later
BASE = "Qwen/Qwen2-1.5B-Instruct"