zakarth committed
Commit df78e06 · verified · 1 Parent(s): 3d20d25

Upload folder using huggingface_hub

Files changed (11):
  1. .gitattributes +1 -0
  2. LICENSE +121 -0
  3. NOTICE +1 -0
  4. README.md +157 -0
  5. config.json +32 -0
  6. generation_config.json +10 -0
  7. model.safetensors +3 -0
  8. special_tokens_map.json +46 -0
  9. tokenizer.json +0 -0
  10. tokenizer_config.json +164 -0
  11. violet.png +3 -0
.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ violet.png filter=lfs diff=lfs merge=lfs -text
LICENSE ADDED
@@ -0,0 +1,121 @@
+ Creative Commons Legal Code
+
+ CC0 1.0 Universal
+
+ CREATIVE COMMONS CORPORATION IS NOT A LAW FIRM AND DOES NOT PROVIDE
+ LEGAL SERVICES. DISTRIBUTION OF THIS DOCUMENT DOES NOT CREATE AN
+ ATTORNEY-CLIENT RELATIONSHIP. CREATIVE COMMONS PROVIDES THIS
+ INFORMATION ON AN "AS-IS" BASIS. CREATIVE COMMONS MAKES NO WARRANTIES
+ REGARDING THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS
+ PROVIDED HEREUNDER, AND DISCLAIMS LIABILITY FOR DAMAGES RESULTING FROM
+ THE USE OF THIS DOCUMENT OR THE INFORMATION OR WORKS PROVIDED
+ HEREUNDER.
+
+ Statement of Purpose
+
+ The laws of most jurisdictions throughout the world automatically confer
+ exclusive Copyright and Related Rights (defined below) upon the creator
+ and subsequent owner(s) (each and all, an "owner") of an original work of
+ authorship and/or a database (each, a "Work").
+
+ Certain owners wish to permanently relinquish those rights to a Work for
+ the purpose of contributing to a commons of creative, cultural and
+ scientific works ("Commons") that the public can reliably and without fear
+ of later claims of infringement build upon, modify, incorporate in other
+ works, reuse and redistribute as freely as possible in any form whatsoever
+ and for any purposes, including without limitation commercial purposes.
+ These owners may contribute to the Commons to promote the ideal of a free
+ culture and the further production of creative, cultural and scientific
+ works, or to gain reputation or greater distribution for their Work in
+ part through the use and efforts of others.
+
+ For these and/or other purposes and motivations, and without any
+ expectation of additional consideration or compensation, the person
+ associating CC0 with a Work (the "Affirmer"), to the extent that he or she
+ is an owner of Copyright and Related Rights in the Work, voluntarily
+ elects to apply CC0 to the Work and publicly distribute the Work under its
+ terms, with knowledge of his or her Copyright and Related Rights in the
+ Work and the meaning and intended legal effect of CC0 on those rights.
+
+ 1. Copyright and Related Rights. A Work made available under CC0 may be
+ protected by copyright and related or neighboring rights ("Copyright and
+ Related Rights"). Copyright and Related Rights include, but are not
+ limited to, the following:
+
+ i. the right to reproduce, adapt, distribute, perform, display,
+ communicate, and translate a Work;
+ ii. moral rights retained by the original author(s) and/or performer(s);
+ iii. publicity and privacy rights pertaining to a person's image or
+ likeness depicted in a Work;
+ iv. rights protecting against unfair competition in regards to a Work,
+ subject to the limitations in paragraph 4(a), below;
+ v. rights protecting the extraction, dissemination, use and reuse of data
+ in a Work;
+ vi. database rights (such as those arising under Directive 96/9/EC of the
+ European Parliament and of the Council of 11 March 1996 on the legal
+ protection of databases, and under any national implementation
+ thereof, including any amended or successor version of such
+ directive); and
+ vii. other similar, equivalent or corresponding rights throughout the
+ world based on applicable law or treaty, and any national
+ implementations thereof.
+
+ 2. Waiver. To the greatest extent permitted by, but not in contravention
+ of, applicable law, Affirmer hereby overtly, fully, permanently,
+ irrevocably and unconditionally waives, abandons, and surrenders all of
+ Affirmer's Copyright and Related Rights and associated claims and causes
+ of action, whether now known or unknown (including existing as well as
+ future claims and causes of action), in the Work (i) in all territories
+ worldwide, (ii) for the maximum duration provided by applicable law or
+ treaty (including future time extensions), (iii) in any current or future
+ medium and for any number of copies, and (iv) for any purpose whatsoever,
+ including without limitation commercial, advertising or promotional
+ purposes (the "Waiver"). Affirmer makes the Waiver for the benefit of each
+ member of the public at large and to the detriment of Affirmer's heirs and
+ successors, fully intending that such Waiver shall not be subject to
+ revocation, rescission, cancellation, termination, or any other legal or
+ equitable action to disrupt the quiet enjoyment of the Work by the public
+ as contemplated by Affirmer's express Statement of Purpose.
+
+ 3. Public License Fallback. Should any part of the Waiver for any reason
+ be judged legally invalid or ineffective under applicable law, then the
+ Waiver shall be preserved to the maximum extent permitted taking into
+ account Affirmer's express Statement of Purpose. In addition, to the
+ extent the Waiver is so judged Affirmer hereby grants to each affected
+ person a royalty-free, non transferable, non sublicensable, non exclusive,
+ irrevocable and unconditional license to exercise Affirmer's Copyright and
+ Related Rights in the Work (i) in all territories worldwide, (ii) for the
+ maximum duration provided by applicable law or treaty (including future
+ time extensions), (iii) in any current or future medium and for any number
+ of copies, and (iv) for any purpose whatsoever, including without
+ limitation commercial, advertising or promotional purposes (the
+ "License"). The License shall be deemed effective as of the date CC0 was
+ applied by Affirmer to the Work. Should any part of the License for any
+ reason be judged legally invalid or ineffective under applicable law, such
+ partial invalidity or ineffectiveness shall not invalidate the remainder
+ of the License, and in such case Affirmer hereby affirms that he or she
+ will not (i) exercise any of his or her remaining Copyright and Related
+ Rights in the Work or (ii) assert any associated claims and causes of
+ action with respect to the Work, in either case contrary to Affirmer's
+ express Statement of Purpose.
+
+ 4. Limitations and Disclaimers.
+
+ a. No trademark or patent rights held by Affirmer are waived, abandoned,
+ surrendered, licensed or otherwise affected by this document.
+ b. Affirmer offers the Work as-is and makes no representations or
+ warranties of any kind concerning the Work, express, implied,
+ statutory or otherwise, including without limitation warranties of
+ title, merchantability, fitness for a particular purpose, non
+ infringement, or the absence of latent or other defects, accuracy, or
+ the present or absence of errors, whether or not discoverable, all to
+ the greatest extent permissible under applicable law.
+ c. Affirmer disclaims responsibility for clearing rights of other persons
+ that may apply to the Work or any use thereof, including without
+ limitation any person's Copyright and Related Rights in the Work.
+ Further, Affirmer disclaims responsibility for obtaining any necessary
+ consents, permissions or other rights required for any use of the
+ Work.
+ d. Affirmer understands and acknowledges that Creative Commons is not a
+ party to this document and has no duty or obligation with respect to
+ this CC0 or use of the Work.
NOTICE ADDED
@@ -0,0 +1 @@
+ violet.png is © @rose.grtqndl (Instagram). Used and redistributed with permission. Copyright remains with the artist.
README.md CHANGED
@@ -1,3 +1,160 @@
  ---
+ language:
+ - en
+ library_name: transformers
+ tags:
+ - text-generation
+ - gpt_neox
+ - roleplay
+ - victorian
  license: cc0-1.0
  ---
+
+ # Violet 1B4 Completion
+
+ ![Violet](./violet.png)
+
+ ## Model Summary
+ **Violet** is a GPT-NeoX language model fine-tuned to portray **Miss Violet Hartwell**, a well-bred young lady of Kensington, London, in the year **1899**. She is trained primarily on period texts (1800–1899) and is intended to behave as if unfamiliar with modern society and events (with occasional OCR artifacts; see Known Issues). This is the completion version of the model; if you are looking for the chat version, see [Violet 1b4 Chat](https://huggingface.co/zakarth/violet-1b4-chat).
+
+ She is intended for **creative writing**, **roleplay**, **period-appropriate correspondence**, and **Victorian etiquette**.
+
+ - Architecture: `GPTNeoXForCausalLM`
+ - Parameters: ~1.41B
+ - Context length: 4096
+ - Vocab size: 24014
+ - Tokenizer: `PreTrainedTokenizerFast`
+
+ ## Intended Use
+ **Good for**
+ - Victorian-flavored narrative completions
+
+ **Not good for**
+ - Contemporary factual Q&A
+ - Medical/legal/financial advice
+
+ ## Known Issues / Limitations
+ - Ages and dates can be unreliable (even within 1800–1899).
+ - Because parts of the corpus were derived from OCR, occasional stray modern tokens may appear (e.g., “http”, “Google”, “Internet Archive”).
+ - Training data includes UK and US English from the era.
+
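If stray OCR-era tokens are a problem for your downstream use, a minimal post-filter can scrub them from completions. This is an illustrative sketch only; the `scrub` helper and its blocklist are hypothetical, not part of the model or this repo:

```python
import re

# Illustrative blocklist of the OCR artifacts mentioned in Known Issues;
# extend it for your own corpus.
ARTIFACTS = re.compile(r"\b(https?\S*|Google|Internet Archive)\b")

def scrub(text: str) -> str:
    # Drop artifact tokens, then collapse the double spaces left behind.
    return re.sub(r"\s{2,}", " ", ARTIFACTS.sub("", text)).strip()

print(scrub("The carriage stopped http at the Google gate."))
# → The carriage stopped at the gate.
```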
+ ## Notes
+ Violet is not the first LLM trained on a historical-only pretraining corpus; to the author’s knowledge that distinction belongs to **TimeCapsuleLLM**. Violet was developed independently, and differs in:
+ - A structured “mood” line as part of chat output (1b4 chat and 160m chat only)
+ - Built-in character design and prompt protocol (1b4 chat and 160m chat only)
+ - A custom Victorian-era tokenizer
+
+ Violet was built on a corpus spanning 1800–1899 sourced from Project Gutenberg, the Internet Archive, the British National Library, and other archives.
+
+ This project began as an attempt to build a local LLM without relying on copyrighted training sources. The author also values local models that can run on a user’s machine without sending data to the cloud.
+
+ ## Demo Resources
+ - HF Space: [Transformers.js Demo](https://huggingface.co/spaces/Zakarth/violetdemo)
+ - Cloudflare Mirror: [Transformers.js Demo](https://pub-353f427e6227415cb077f3645638c125.r2.dev/index.html)
+ - Both demos use WebGPU and run locally on your system; no data is sent to the cloud.
+
+ ## Related repos
+ - `Zakarth/violet-1b4` (base/completion)
+ - `Zakarth/violet-1b4-chat-onnx` (WebGPU INT8)
+
+ ## Prompt Format (Completion)
+ This is a plain completion model: give it the opening of a passage and it continues the text. (The structured mood line and assistant tag apply only to the chat variants.)
+
+ Use this structure:
+
+ ```text
+ The morning fog had scarcely lifted when
+ ```
+
+ The model will then generate:
+
+ ```text
+ {response...}
+ ```
+
+ ## Tokenization and Special Tokens
+ Violet 1b4 was trained with a custom tokenizer built specifically for Victorian text.
+
+ Recommended IDs for generation:
+ * eos_token_id: 0
+ * pad_token_id: 1
+
+ Special tokens used during training (typical IDs from the training config):
+
+ * `<|system|>`: 24000
+ * `<|user|>`: 24001
+ * `<|assistant|>`: 24002
+ * `<|violet_mood|>`: 24005
+
+ **Do not mix tokenizers from other Violet variants (e.g. 160M) with this model.**
+
+ ## How to use (Transformers)
+ ```python
+ from transformers import AutoTokenizer, AutoModelForCausalLM
+ import torch
+
+ repo = "Zakarth/violet-1b4"
+ tok = AutoTokenizer.from_pretrained(repo, use_fast=True)
+ model = AutoModelForCausalLM.from_pretrained(repo, device_map="auto")
+
+ prompt = "The morning fog had scarcely lifted when"
+
+ inputs = tok(prompt, return_tensors="pt", add_special_tokens=False)
+ # Keep only the keys generate() expects (drops token_type_ids if present)
+ inputs = {k: v.to(model.device) for k, v in inputs.items() if k in ["input_ids", "attention_mask"]}
+
+ out = model.generate(
+     **inputs,
+     max_new_tokens=180,
+     do_sample=True,
+     temperature=0.8,
+     top_p=0.9,
+     top_k=40,
+     repetition_penalty=1.15,
+     eos_token_id=0,
+     pad_token_id=1,
+ )
+ # Decode only the newly generated tokens, not the prompt
+ print(tok.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=False))
+ ```
+
+ ## Sample Outputs
+
+ > The morning fog had scarcely lifted when
+
+ ```text
+ The morning fog had scarcely lifted whenthe first
+ light streamed through the window, and before it was quite light the
+ flood of rain came on with a suddenness which seemed to scorch up the
+ roof. The lightning was as bright as ever, but there were only three or
+ four flashes in the sky--a bright flash like a meteor--and the thunder
+ was not so loud nor so deafening as usual.
+
+ At last the storm ceased. The storm was over; the stars shone out; the
+ thunder rolled away, leaving the clouds behind it in an impenetrable
+ haze, which at once became visible, and soon they disappeared. The wind
+ blew with fury, driving the snow and ice from off the roofs of the houses
+ ```
+
+ ## License
+
+ Model weights and code in this repository are released under CC0 1.0 (public domain dedication).
+
+ ## Artwork
+
+ violet.png is © @rose.grtqndl (Instagram). Used and redistributed with permission; copyright remains with the artist.
+
+ ## Contact
+
+ You may contact me on X (or elsewhere) by searching for my handle.
+
+ ## Citation
+
+ ```bibtex
+ @misc{violet2026,
+   author = {Zakarth},
+   title = {Violet: Victorian Language Models},
+   year = {2026},
+   publisher = {HuggingFace},
+   url = {https://huggingface.co/Zakarth/violet-1b4-chat}
+ }
+ ```
config.json ADDED
@@ -0,0 +1,32 @@
+ {
+   "architectures": [
+     "GPTNeoXForCausalLM"
+   ],
+   "attention_bias": true,
+   "attention_dropout": 0.0,
+   "bos_token_id": 0,
+   "classifier_dropout": 0.1,
+   "dtype": "bfloat16",
+   "eos_token_id": 0,
+   "hidden_act": "gelu",
+   "hidden_dropout": 0.0,
+   "hidden_size": 2048,
+   "initializer_range": 0.02,
+   "intermediate_size": 8192,
+   "layer_norm_eps": 1e-05,
+   "max_position_embeddings": 4096,
+   "model_type": "gpt_neox",
+   "num_attention_heads": 16,
+   "num_hidden_layers": 26,
+   "pad_token_id": 1,
+   "partial_rotary_factor": 1.0,
+   "rope_scaling": null,
+   "rope_theta": 10000,
+   "rotary_emb_base": 10000,
+   "rotary_pct": 1.0,
+   "tie_word_embeddings": false,
+   "transformers_version": "4.57.3",
+   "use_cache": false,
+   "use_parallel_residual": true,
+   "vocab_size": 24014
+ }
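As a sanity check, the model card's stated ~1.41B parameter count can be reproduced from this config. The sketch below assumes the standard GPT-NeoX layout (fused QKV projection, biased projections and LayerNorms per `attention_bias: true`, and a separate output head per `tie_word_embeddings: false`):

```python
# Values taken from config.json above.
h, layers, vocab, inter = 2048, 26, 24014, 8192

embed = vocab * h        # input embedding matrix
lm_head = vocab * h      # separate output head ("tie_word_embeddings": false)
per_layer = (
    h * 3 * h + 3 * h    # fused query/key/value projection (+ bias)
    + h * h + h          # attention output projection (+ bias)
    + h * inter + inter  # MLP up-projection (+ bias)
    + inter * h + h      # MLP down-projection (+ bias)
    + 4 * h              # two LayerNorms, each with weight and bias
)
total = embed + lm_head + layers * per_layer + 2 * h  # + final LayerNorm
print(f"~{total / 1e9:.2f}B parameters")
# → ~1.41B parameters
```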
generation_config.json ADDED
@@ -0,0 +1,10 @@
+ {
+   "_from_model_config": true,
+   "bos_token_id": 0,
+   "eos_token_id": [
+     0,
+     2
+   ],
+   "pad_token_id": 1,
+   "transformers_version": "4.57.3"
+ }
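Note that `eos_token_id` here is a list: per the tokenizer files in this repo, generation stops on id 0 (`<|endoftext|>`) or id 2 (`<|unk|>`). A small self-contained sketch of reading that out (the config JSON is inlined for illustration):

```python
import json

# generation_config.json from above, inlined for illustration.
cfg = json.loads("""
{
  "_from_model_config": true,
  "bos_token_id": 0,
  "eos_token_id": [0, 2],
  "pad_token_id": 1,
  "transformers_version": "4.57.3"
}
""")

# transformers accepts a list of eos ids and halts on whichever appears first.
stop_ids = cfg["eos_token_id"]
print(stop_ids)
# → [0, 2]
```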
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:21e4c5287f51bf205c4784acc8c8a0fae7d6ef4214b30a98c69b3f1e54b3949b
+ size 2815398440
special_tokens_map.json ADDED
@@ -0,0 +1,46 @@
+ {
+   "additional_special_tokens": [
+     "<|system|>",
+     "<|user|>",
+     "<|assistant|>",
+     "<|violet|>",
+     "<|violet_rpm|>",
+     "<|violet_mood|>",
+     "<|violet_act|>",
+     "<|doc|>",
+     "<|novel|>",
+     "<|letter|>",
+     "<|periodical|>",
+     "<|title|>",
+     "<|author|>",
+     "<|year|>"
+   ],
+   "bos_token": {
+     "content": "<|endoftext|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "<|endoftext|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": {
+     "content": "<|pad|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "unk_token": {
+     "content": "<|unk|>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,164 @@
+ {
+   "added_tokens_decoder": {
+     "0": {
+       "content": "<|endoftext|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "<|pad|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "2": {
+       "content": "<|unk|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "24000": {
+       "content": "<|system|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "24001": {
+       "content": "<|user|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "24002": {
+       "content": "<|assistant|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "24003": {
+       "content": "<|violet|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "24004": {
+       "content": "<|violet_rpm|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "24005": {
+       "content": "<|violet_mood|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "24006": {
+       "content": "<|violet_act|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "24007": {
+       "content": "<|doc|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "24008": {
+       "content": "<|novel|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "24009": {
+       "content": "<|letter|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "24010": {
+       "content": "<|periodical|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "24011": {
+       "content": "<|title|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "24012": {
+       "content": "<|author|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "24013": {
+       "content": "<|year|>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "additional_special_tokens": [
+     "<|system|>",
+     "<|user|>",
+     "<|assistant|>",
+     "<|violet|>",
+     "<|violet_rpm|>",
+     "<|violet_mood|>",
+     "<|violet_act|>",
+     "<|doc|>",
+     "<|novel|>",
+     "<|letter|>",
+     "<|periodical|>",
+     "<|title|>",
+     "<|author|>",
+     "<|year|>"
+   ],
+   "bos_token": "<|endoftext|>",
+   "clean_up_tokenization_spaces": false,
+   "eos_token": "<|endoftext|>",
+   "extra_special_tokens": {},
+   "model_max_length": 1000000000000000019884624838656,
+   "pad_token": "<|pad|>",
+   "tokenizer_class": "PreTrainedTokenizerFast",
+   "unk_token": "<|unk|>"
+ }
violet.png ADDED

Git LFS Details

  • SHA256: 9a1d7d864791b3965d1f16c3358e92d294582fc2a35ef7af3b5605455016cb61
  • Pointer size: 131 Bytes
  • Size of remote file: 153 kB