PingVortex committed (verified) · Commit a91b842 · 1 parent: 672c426

Update README.md

Files changed (1): README.md (+32 −3)
README.md CHANGED
@@ -10,8 +10,37 @@ tags:
 
 ![banner by CroissantWhyNot](banner.png)
 
-Banner by [Croissant](https://huggingface.co/CroissantWhyNot)
+*Banner by [Croissant](https://huggingface.co/CroissantWhyNot)*
 
-# N1
+# N1 - A Chain-of-Thought Language Model
 
-N1 is a COT model base on the Llama architecture.
+N1 is a small, experimental Chain-of-Thought (CoT) model based on the LLaMA architecture, developed by GoofyLM.
+
+## Model Details
+
+- **Architecture**: LLaMA-based
+- **Parameters**: 135M
+- **Training Data**: Closed-source dataset
+- **Special Features**: Chain-of-Thought reasoning capabilities
+- **Note**: The model can be erratic and is prone to incoherent output
+
+## Intended Use
+
+This model is designed for text-generation tasks, with a focus on reasoning through problems step by step (Chain-of-Thought).
+
+## Limitations
+
+- The small parameter count (135M) limits reasoning capability
+- May produce unstable or inconsistent outputs
+- Not suitable for production use without further testing
+
+## Usage
+
+The model can be loaded with the Transformers library:
+
+```python
+from transformers import AutoModelForCausalLM, AutoTokenizer
+
+model = AutoModelForCausalLM.from_pretrained("GoofyLM/N1")
+tokenizer = AutoTokenizer.from_pretrained("GoofyLM/N1")
+```
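
Building on the loading snippet in the README, a minimal generation sketch; the prompt and the generation settings below are illustrative assumptions, not values from the model card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id taken from the README above.
model = AutoModelForCausalLM.from_pretrained("GoofyLM/N1")
tokenizer = AutoTokenizer.from_pretrained("GoofyLM/N1")

# A step-by-step prompt to exercise the model's CoT behavior
# (hypothetical prompt, not from the model card).
prompt = "Question: What is 2 + 2? Let's think step by step."
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding keeps the sketch deterministic; token budget is arbitrary.
outputs = model.generate(**inputs, max_new_tokens=64)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

Given the limitations noted above, treat any generated reasoning as unreliable and validate outputs before use.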