---
license: mit
language:
- en
tags:
- text-generation-inference
pipeline_tag: text-generation
---

## GPT-Usenet-3-Small
A 6.83-million-parameter LLM that uses the GPT-2 token encoding.
Trained on 13 GB of USENET posts, miscellaneous BBS posts, digitized books, and text documents, plus 1.1 GB of multilingual text.
Supervised fine-tuning should be performed before use.

## Purpose of GPT-Usenet-3
Current LLM development focuses on ever-larger models that can do more and more, which makes them jacks of all trades but masters of none. GPT-Usenet takes a different approach: instead of trying to do everything, it offers a digital stem cell that can be fine-tuned into a single specialized role and run in parallel with copies of itself.

## Technical Information
| Parameter | Value |
|---------------------------------|----:|
|Layers |2|
|Heads |2|
|Embedding size |128|
|Context window |8192 tokens|
|Tokenizer |GPT-2 BPE|

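The 6.83-million figure can be sanity-checked against the table. The sketch below is a back-of-the-envelope count, assuming a GPT-2-style block (4x MLP, biases, tied input/output embeddings) and no learned position-embedding table — those assumptions are not stated in the card:

```python
# Back-of-the-envelope parameter count for the configuration above.
# Assumptions (not stated in the card): GPT-2-style blocks with a 4x MLP,
# tied input/output embeddings, and no learned position-embedding table.
vocab_size = 50257  # GPT-2 BPE vocabulary
n_embd = 128
n_layer = 2

token_embeddings = vocab_size * n_embd            # wte, tied with the output head
attn = (n_embd * 3 * n_embd + 3 * n_embd) \
     + (n_embd * n_embd + n_embd)                 # qkv projection + output projection
mlp = (n_embd * 4 * n_embd + 4 * n_embd) \
    + (4 * n_embd * n_embd + n_embd)              # up projection + down projection
layer_norms = 2 * 2 * n_embd                      # ln_1 and ln_2, weight + bias each
final_ln = 2 * n_embd

total = token_embeddings + n_layer * (attn + mlp + layer_norms) + final_ln
print(f"{total:,}")  # 6,829,696, i.e. about 6.83M
```

Under these assumptions the count lands on 6,829,696 parameters, matching the stated 6.83M.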
## Example Syntax

| Field | Meaning |
|---------------------------------|-----|
|uucp:|The path of reasoning you want GPT-Usenet to use when thinking. Use lowercase words separated by exclamation points.|
|Internet:|The system calls relevant to this email.|
|Path:|The path of reasoning you want GPT-Usenet to use when writing. Use lowercase words separated by exclamation points.|
|From:|The username that sent this message.|
|Sender:|The group that username belongs to.|
|Newsgroups:|The broad subject field of the email.|
|Subject:|The prompt.|
|Message-ID:|The type of message this is.|
|Date:|Use this field to simulate urgency or moods.|
|Organization:|The system GPT-Usenet is running on (testing, deployment, simulation).|
|Lines:|How long the message is.|
|(body)|Write the SFT response here. Prefix the first sentence with `>` to mark it as a reasoning sentence.|
|--|The stop tokens.|

```
uucp:!field1!field2!
Internet:simulation
Path:!field1!field2!
From:user
Sender:usergroup
Newsgroups:motorskills.papercraft
Subject:Build a paper airplane
Message-ID:Command
Date:01 Jan 01 00:00:01 GMT
Organization:deployment
Lines: 1

>Provide detailed steps on building a paper airplane.

--
```

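The example above can be assembled programmatically from the field table. This is a minimal sketch: `build_prompt` is a hypothetical helper, and the field values are the illustrative ones from the example, not part of the model's tooling.

```python
# Sketch: assemble a GPT-Usenet prompt from the header fields described
# above. build_prompt is a hypothetical helper; the values mirror the
# example in the card.

def build_prompt(fields: dict, body: str = "") -> str:
    """Join 'Header:value' lines, then a blank line, then the body."""
    header = "\n".join(f"{name}{value}" for name, value in fields.items())
    return header + "\n\n" + body

prompt = build_prompt(
    {
        "uucp:": "!field1!field2!",
        "Internet:": "simulation",
        "Path:": "!field1!field2!",
        "From:": "user",
        "Sender:": "usergroup",
        "Newsgroups:": "motorskills.papercraft",
        "Subject:": "Build a paper airplane",
        "Message-ID:": "Command",
        "Date:": "01 Jan 01 00:00:01 GMT",
        "Organization:": "deployment",
        "Lines:": " 1",
    },
    body=">Provide detailed steps on building a paper airplane.\n\n--",
)
print(prompt)
```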
For fine-tuning, your data should be in the mbox format.
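Since the training data is expected in mbox format, it can be read with Python's standard-library `mailbox` module. A minimal sketch — the function name and path are illustrative, not part of the card:

```python
import mailbox

# Sketch: read fine-tuning examples from an mbox file with Python's
# standard-library mailbox module. load_examples is a hypothetical helper.

def load_examples(path: str) -> list[str]:
    """Return the plain-text body of each non-multipart message."""
    examples = []
    for message in mailbox.mbox(path):
        payload = message.get_payload()
        if isinstance(payload, str):  # skip multipart messages
            examples.append(payload)
    return examples
```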