Syamsuddin committed
Commit 40c8aed · verified · 1 Parent(s): 2323e5d

Update README.md

Files changed (1): README.md (+10 −10)
README.md CHANGED

@@ -1,35 +1,35 @@
 ---
 license: cc-by-4.0
-model_name: N-Transformers v1.0 (NAFSI-Transformers family)
+model_name: N-Transformer v1.0 (NAFSI-Transformer family)
 language:
 - en
 - id
-library_name: transformers
+library_name: transformer
 pipeline_tag: text-generation
 tags:
 - consciousness
-- transformers
+- transformer
 - research
 - architecture
 - alignment
 - safety
 model_type: decoder
-model_creator: Syamsuddin (@syam_ideris) & Prometheus
+model_creator: Syamsuddin (@syam_ideris)
 # base_model: Qwen/Qwen2-1.5B-Instruct # <- isi jika nanti ada weights turunan
 # datasets:
 # - your-dataset-id
 ---
 
-# N-Transformers (NAFSI-Transformers) — v1.0
+# N-transformer (NAFSI-transformer) — v1.0
 
 [![License: CC BY 4.0](https://img.shields.io/badge/License-CC%20BY%204.0-blue.svg)](https://creativecommons.org/licenses/by/4.0/)
 ![Status](https://img.shields.io/badge/Status-Research%20Draft-ffa500)
-![Transformers](https://img.shields.io/badge/Transformers-%E2%89%A5%204.42-0f7)
+![transformer](https://img.shields.io/badge/transformer-%E2%89%A5%204.42-0f7)
 ![Python](https://img.shields.io/badge/Python-3.10%2B-informational)
 ![PRs](https://img.shields.io/badge/PRs-welcome-brightgreen)
-![Topics](https://img.shields.io/badge/topic-transformers%20%7C%20architecture%20%7C%20alignment-6f42c1)
+![Topics](https://img.shields.io/badge/topic-transformer%20%7C%20architecture%20%7C%20alignment-6f42c1)
 
-> **One-liner** — N-Transformers menambahkan **Phenomenal Field (PF)** paralel, **Intrinsic Metric Engine (IME)**, dan **Normative Gauge** (NTI/LCA/LCG) ke Transformer standar untuk memunculkan properti *consciousness-like* yang terukur: integrasi, valensi, self/now anchoring, dan global broadcasting—tanpa mengubah loop training LM.
+> **One-liner** — N-transformer menambahkan **Phenomenal Field (PF)** paralel, **Intrinsic Metric Engine (IME)**, dan **Normative Gauge** (NTI/LCA/LCG) ke Transformer standar untuk memunculkan properti *consciousness-like* yang terukur: integrasi, valensi, self/now anchoring, dan global broadcasting—tanpa mengubah loop training LM.
 
 ---
 
@@ -39,7 +39,7 @@ model_creator: Syamsuddin (@syam_ideris) & Prometheus
 - **Mengapa beda:** **Lightcone Attention (LCA)** bias lintas-jangkauan, **NTI** sebagai episodic controller, dan **SNA/GIW** untuk siaran global terintegrasi.
 - **Status:** v1.0 **Research Draft** (spesifikasi lengkap + reference code; rilis bobot menyusul bila siap).
 
-**Bahasa Indonesia singkat:** N-Transformers menambah PF, metrik intrinsik (IME), serta gauge normatif (NTI/LCA/LCG) untuk kohesi naratif jarak jauh, valensi terkalibrasi, dan jangkar “aku-kini” yang bisa diuji.
+**Bahasa Indonesia singkat:** N-transformer menambah PF, metrik intrinsik (IME), serta gauge normatif (NTI/LCA/LCG) untuk kohesi naratif jarak jauh, valensi terkalibrasi, dan jangkar “aku-kini” yang bisa diuji.
 
 ---
 
@@ -53,7 +53,7 @@ model_creator: Syamsuddin (@syam_ideris) & Prometheus
 Repo ini berisi **spesifikasi** dan **reference code** (PF-path + coupler). Adaptasikan ke LM Anda.
 
 ```python
-from transformers import AutoTokenizer, AutoModelForCausalLM
+from transformer import AutoTokenizer, AutoModelForCausalLM
 # Placeholder; ganti dengan checkpoint yang Anda rilis nanti
 BASE = "Qwen/Qwen2-1.5B-Instruct"
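The README being changed names **Lightcone Attention (LCA)** as a cross-range attention bias, but the diff carries no implementation of it. Below is a purely hypothetical, minimal sketch of what a distance-horizon attention bias could look like; every function name and the `horizon` parameter are invented for illustration and are not part of this repo.

```python
import math

def lightcone_bias(query_pos: int, key_pos: int, horizon: float = 8.0) -> float:
    """Additive score bias (hypothetical): tokens within the 'lightcone'
    horizon are untouched; tokens beyond it are linearly penalized."""
    distance = abs(query_pos - key_pos)
    return -max(0.0, distance - horizon) / horizon

def attention_weights(query_pos, key_positions, scores, horizon: float = 8.0):
    """Softmax over raw attention scores plus the lightcone bias."""
    biased = [s + lightcone_bias(query_pos, k, horizon)
              for s, k in zip(scores, key_positions)]
    peak = max(biased)  # subtract the max for numerical stability
    exps = [math.exp(b - peak) for b in biased]
    total = sum(exps)
    return [e / total for e in exps]

# With equal raw scores, a key inside the horizon outweighs one far outside it.
weights = attention_weights(query_pos=0, key_positions=[1, 50], scores=[0.0, 0.0])
```

The point of the sketch is only the shape of the mechanism: a position-dependent additive term folded into the scores before the softmax, leaving the rest of the attention computation unchanged.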