madmasyhur committed on
Commit
0be50d5
·
1 Parent(s): e34a0cc

Upload DreamBooth model for Indonesian batik patterns


- Added fine-tuned Stable Diffusion v1.5 model
- Trained on 20 traditional Indonesian batik motifs
- 8000 training steps with DreamBooth technique
- Includes unet, text_encoder, tokenizer components
- Complete documentation with usage examples

README.md ADDED
@@ -0,0 +1,158 @@
---
tags:
- stable-diffusion
- dreambooth
- batik
- indonesian-culture
- text-to-image
- diffusers
base_model: izzudd/sd-batik-llava
pipeline_tag: text-to-image
---

# SD Batik DreamBooth - Indonesian Traditional Patterns

This is a Stable Diffusion v1.5-based model fine-tuned with the DreamBooth technique on 20 traditional Indonesian batik motifs. It generates Indonesian batik patterns via a dedicated trigger token for each of the 20 motifs.

## Model Description

This model was trained to understand and generate traditional Indonesian batik patterns, preserving the cultural heritage and artistic characteristics of each motif. Each batik style was learned individually during DreamBooth fine-tuning.

## Usage

```python
from diffusers import StableDiffusionPipeline
import torch

# Load the model
pipe = StableDiffusionPipeline.from_pretrained(
    "madmasyhur/sd-batik-dreambooth",
    torch_dtype=torch.float16,
    safety_checker=None,
    requires_safety_checker=False
)
pipe = pipe.to("cuda")

# Generate a batik pattern.
# Use both the unique token and the motif name for better results.
prompt = "a photo of balibtk batik bali pattern, traditional indonesian textile, high quality, detailed"
negative_prompt = "blurry, low quality, distorted, text, watermark"

image = pipe(
    prompt,
    negative_prompt=negative_prompt,
    num_inference_steps=50,
    guidance_scale=7.5,
    height=512,
    width=512
).images[0]

image.save("batik_pattern.png")
```

## Supported Batik Motifs & Tokens

| Motif | Token | Region/Style | Description |
|-------|-------|--------------|-------------|
| Bali | `balibtk` | Balinese traditional | Traditional Balinese ornamental patterns |
| Betawi | `betbtk` | Jakarta traditional | Jakarta's indigenous cultural motifs |
| Cendrawasih | `cenbtk` | Papua bird motif | Bird of paradise inspired patterns |
| Ceplok | `cepbtk` | Central Java geometric | Geometric circular and floral motifs |
| Dayak | `daybtk` | Kalimantan traditional | Borneo indigenous tribal patterns |
| Garutan | `garbtk` | Garut regional style | West Java regional motifs |
| Gentongan | `genbtk` | Traditional dyeing technique | Classic dyeing method patterns |
| Jumputan | `jumbtk` | Tie-dye technique | Traditional tie-dye resist patterns |
| Kawung | `kawbtk` | Yogyakarta classic | Four-circle geometric royal pattern |
| Keraton | `kerbtk` | Royal court style | Noble palace traditional designs |
| Lasem | `lasbtk` | Chinese-Javanese fusion | Chinese-influenced coastal patterns |
| Megamendung | `mgmdbtk` | Cirebon cloud pattern | Stylized cloud and sky motifs |
| Nitik | `nitbtk` | Fine dot pattern | Delicate small dot decorations |
| Parang | `parbtk` | Diagonal knife pattern | Diagonal sword/blade motifs |
| Sasirangan | `sasbtk` | Banjarmasin traditional | South Kalimantan traditional patterns |
| Sekar | `sekbtk` | Floral pattern | Various flower and plant motifs |
| Sidoluhur | `sidlbtk` | Javanese noble pattern | High-status traditional design |
| Sidomukti | `sidmbtk` | Javanese prosperity pattern | Prosperity and fortune symbols |
| Sogan | `sogbtk` | Brown natural dye | Natural brown-colored traditional dye |
| Tambal | `tambtk` | Patchwork pattern | Mixed patchwork design motifs |

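Since every motif pairs a trigger token with its plain name, the table above can be kept as a small mapping and formatted into the recommended prompt template programmatically. A minimal sketch (the `build_prompt` helper and the `BATIK_TOKENS` name are illustrative, not part of the model):

```python
# Token -> motif mapping, copied from the table above.
BATIK_TOKENS = {
    "balibtk": "bali", "betbtk": "betawi", "cenbtk": "cendrawasih",
    "cepbtk": "ceplok", "daybtk": "dayak", "garbtk": "garutan",
    "genbtk": "gentongan", "jumbtk": "jumputan", "kawbtk": "kawung",
    "kerbtk": "keraton", "lasbtk": "lasem", "mgmdbtk": "megamendung",
    "nitbtk": "nitik", "parbtk": "parang", "sasbtk": "sasirangan",
    "sekbtk": "sekar", "sidlbtk": "sidoluhur", "sidmbtk": "sidomukti",
    "sogbtk": "sogan", "tambtk": "tambal",
}

def build_prompt(token: str, motif: str) -> str:
    """Combine the unique token and motif name, as recommended above."""
    return (f"a photo of {token} batik {motif} pattern, "
            "traditional indonesian textile, high quality, detailed")

# One ready-to-use prompt per motif.
prompts = [build_prompt(tok, motif) for tok, motif in BATIK_TOKENS.items()]
```

Each entry in `prompts` can then be fed to the pipeline from the usage example above.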
## Example Prompts

### Classic Patterns
```
"a photo of kawbtk batik kawung pattern, traditional indonesian textile, geometric design, high quality"
"a photo of parbtk batik parang pattern, diagonal lines, traditional javanese, detailed, masterpiece"
"a photo of kerbtk batik keraton pattern, royal court style, intricate details, traditional"
```

### Regional Styles
```
"a photo of mgmdbtk batik megamendung pattern, cloud motif, cirebon style, intricate details"
"a photo of cenbtk batik cendrawasih pattern, bird of paradise, papua traditional, vibrant colors"
"a photo of daybtk batik dayak pattern, tribal design, kalimantan traditional, authentic"
```

### Natural Motifs
```
"a photo of sekbtk batik sekar pattern, floral design, traditional indonesian, beautiful flowers"
"a photo of sogbtk batik sogan pattern, natural brown dye, traditional coloring, organic texture"
"a photo of lasbtk batik lasem pattern, coastal design, chinese javanese fusion, elegant"
```

### Technique-Based
```
"a photo of jumbtk batik jumputan pattern, tie dye technique, traditional resist dyeing, colorful"
"a photo of genbtk batik gentongan pattern, traditional dyeing method, authentic technique, detailed"
"a photo of nitbtk batik nitik pattern, fine dots, delicate pattern, traditional craftsmanship"
```

## Training Details

- **Base Model**: izzudd/sd-batik-llava
- **Training Method**: DreamBooth
- **Training Steps**: 8000
- **Learning Rate**: 2e-6
- **Training Resolution**: 256x256
- **Inference Resolution**: 512x512
- **Dataset**: 20 traditional Indonesian batik motifs
- **Mixed Precision**: no
- **Optimizer**: AdamW

## Model Performance

The model was trained to maintain the authentic characteristics of each batik motif while still being able to generate variations and combinations. The training process focused on:

- Preserving traditional pattern structures
- Maintaining cultural authenticity
- Generating high-quality textile textures
- Understanding motif-specific characteristics

## Cultural Considerations

This model represents traditional Indonesian cultural heritage. Please use it respectfully and acknowledge the cultural significance of batik in Indonesian society. Indonesian batik is inscribed by UNESCO on the Representative List of the Intangible Cultural Heritage of Humanity (2009).

## Technical Notes

- Best results with a guidance scale of 7.5-12.0
- Recommended inference steps: 50-100
- Works well with negative prompts to avoid unwanted artifacts
- Can be combined with other prompt terms for creative variations

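When sweeping the guidance-scale or step ranges above (or comparing motifs side by side), it can help to tile several outputs into one contact sheet. A minimal sketch using Pillow; the `make_grid` helper is illustrative, not part of the model, and assumes equally sized images such as the 512x512 outputs from the usage example:

```python
from PIL import Image

def make_grid(images: list[Image.Image], cols: int) -> Image.Image:
    """Tile equally sized images into a cols-wide grid, row by row."""
    w, h = images[0].size
    rows = (len(images) + cols - 1) // cols  # ceiling division
    grid = Image.new("RGB", (cols * w, rows * h))
    for i, img in enumerate(images):
        grid.paste(img, ((i % cols) * w, (i // cols) * h))
    return grid

# Placeholder tiles stand in for pipe(...).images here.
tiles = [Image.new("RGB", (512, 512), c) for c in ("red", "green", "blue", "white")]
sheet = make_grid(tiles, cols=2)
sheet.save("batik_contact_sheet.png")
```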
## Citation

If you use this model in your research or creative work, please consider citing:

```bibtex
@misc{sd-batik-dreambooth,
  title={SD Batik DreamBooth - Indonesian Traditional Patterns},
  author={madmasyhur},
  year={2025},
  url={https://huggingface.co/madmasyhur/sd-batik-dreambooth}
}
```

## Acknowledgments

- Traditional batik artisans and cultural heritage preservationists
- Indonesian Ministry of Culture for cultural documentation
- Stability AI for the base Stable Diffusion model
- Hugging Face for the diffusers pipeline infrastructure
text_encoder/config.json ADDED
@@ -0,0 +1,25 @@
{
  "_name_or_path": "izzudd/sd-batik-llava",
  "architectures": [
    "CLIPTextModel"
  ],
  "attention_dropout": 0.0,
  "bos_token_id": 0,
  "dropout": 0.0,
  "eos_token_id": 2,
  "hidden_act": "quick_gelu",
  "hidden_size": 768,
  "initializer_factor": 1.0,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-05,
  "max_position_embeddings": 77,
  "model_type": "clip_text_model",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 1,
  "projection_dim": 768,
  "torch_dtype": "float32",
  "transformers_version": "4.34.0",
  "vocab_size": 49428
}
text_encoder/pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d5462a708645563685c1355927b858efd5b9fbdea8fc1ff877c9411823bff199
size 492370714
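The text-encoder weights themselves are stored via Git LFS, so the commit records only the pointer above (spec version, SHA-256 object id, and byte size). A small illustrative sketch of parsing such a pointer, using the values shown above:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into a key -> value dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:d5462a708645563685c1355927b858efd5b9fbdea8fc1ff877c9411823bff199
size 492370714
"""
info = parse_lfs_pointer(pointer)
print(f"text_encoder weights: {int(info['size']) / 1e6:.0f} MB")  # 492 MB
```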
tokenizer/added_tokens.json ADDED
@@ -0,0 +1,24 @@
{
  "<|endoftext|>": 49407,
  "<|startoftext|>": 49406,
  "balibtk": 49408,
  "betbtk": 49409,
  "cenbtk": 49410,
  "cepbtk": 49411,
  "daybtk": 49412,
  "garbtk": 49413,
  "genbtk": 49414,
  "jumbtk": 49415,
  "kawbtk": 49416,
  "kerbtk": 49417,
  "lasbtk": 49418,
  "mgmdbtk": 49419,
  "nitbtk": 49420,
  "parbtk": 49421,
  "sasbtk": 49422,
  "sekbtk": 49423,
  "sidlbtk": 49424,
  "sidmbtk": 49425,
  "sogbtk": 49426,
  "tambtk": 49427
}
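The 20 trigger tokens are appended directly after the base CLIP vocabulary (ids 0-49407, where 49406/49407 are the start/end tokens), which is why `vocab_size` in `text_encoder/config.json` is 49428. A quick self-contained check of that arithmetic, reusing the mapping above:

```python
import json

# The added-tokens mapping, copied from tokenizer/added_tokens.json above
# (special start/end tokens omitted).
added_tokens = json.loads("""{
  "balibtk": 49408, "betbtk": 49409, "cenbtk": 49410, "cepbtk": 49411,
  "daybtk": 49412, "garbtk": 49413, "genbtk": 49414, "jumbtk": 49415,
  "kawbtk": 49416, "kerbtk": 49417, "lasbtk": 49418, "mgmdbtk": 49419,
  "nitbtk": 49420, "parbtk": 49421, "sasbtk": 49422, "sekbtk": 49423,
  "sidlbtk": 49424, "sidmbtk": 49425, "sogbtk": 49426, "tambtk": 49427
}""")

ids = sorted(added_tokens.values())
assert len(added_tokens) == 20           # one trigger token per motif
assert ids == list(range(49408, 49428))  # contiguous block after the base vocab
print("new vocab_size:", ids[-1] + 1)    # 49428, matching the text-encoder config
```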
tokenizer/merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer/special_tokens_map.json ADDED
@@ -0,0 +1,6 @@
{
  "bos_token": "<|startoftext|>",
  "eos_token": "<|endoftext|>",
  "pad_token": "<|endoftext|>",
  "unk_token": "<|endoftext|>"
}
tokenizer/tokenizer_config.json ADDED
@@ -0,0 +1,192 @@
{
  "add_prefix_space": false,
  "added_tokens_decoder": {
    "49406": { "content": "<|startoftext|>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false, "special": true },
    "49407": { "content": "<|endoftext|>", "lstrip": false, "normalized": true, "rstrip": false, "single_word": false, "special": true },
    "49408": { "content": "balibtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49409": { "content": "betbtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49410": { "content": "cenbtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49411": { "content": "cepbtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49412": { "content": "daybtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49413": { "content": "garbtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49414": { "content": "genbtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49415": { "content": "jumbtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49416": { "content": "kawbtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49417": { "content": "kerbtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49418": { "content": "lasbtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49419": { "content": "mgmdbtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49420": { "content": "nitbtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49421": { "content": "parbtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49422": { "content": "sasbtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49423": { "content": "sekbtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49424": { "content": "sidlbtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49425": { "content": "sidmbtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49426": { "content": "sogbtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false },
    "49427": { "content": "tambtk", "lstrip": true, "normalized": true, "rstrip": true, "single_word": false, "special": false }
  },
  "additional_special_tokens": [],
  "bos_token": "<|startoftext|>",
  "clean_up_tokenization_spaces": true,
  "do_lower_case": true,
  "eos_token": "<|endoftext|>",
  "errors": "replace",
  "model_max_length": 77,
  "pad_token": "<|endoftext|>",
  "tokenizer_class": "CLIPTokenizer",
  "tokenizer_file": null,
  "unk_token": "<|endoftext|>"
}
tokenizer/vocab.json ADDED
The diff for this file is too large to render. See raw diff
 
training_state.json ADDED
@@ -0,0 +1,711 @@
{
  "step": 8000,
  "config": {
    "model_id": "izzudd/sd-batik-llava",
    "revision": "main",
    "learning_rate": 2e-06,
    "max_train_steps": 8000,
    "gradient_accumulation_steps": 4,
    "mixed_precision": "no",
    "prior_loss_weight": 1.0,
    "orthogonality_loss_weight": 0.5,
    "validation_frequency": 400,
    "sample_batch_size": 2,
    "early_stopping_patience": 1000,
    "target_token_similarity": 0.8,
    "target_cross_contamination": 0.4,
    "target_inception_score": 2.2,
    "target_fid_score": 120.0,
    "motifs": [
      "bali", "betawi", "cendrawasih", "ceplok", "dayak",
      "garutan", "gentongan", "jumputan", "kawung", "keraton",
      "lasem", "megamendung", "nitik", "parang", "sasirangan",
      "sekar", "sidoluhur", "sidomukti", "sogan", "tambal"
    ],
    "tokens": [
      "balibtk", "betbtk", "cenbtk", "cepbtk", "daybtk",
      "garbtk", "genbtk", "jumbtk", "kawbtk", "kerbtk",
      "lasbtk", "mgmdbtk", "nitbtk", "parbtk", "sasbtk",
      "sekbtk", "sidlbtk", "sidmbtk", "sogbtk", "tambtk"
    ],
    "resolution": 256,
    "center_crop": true,
    "data_root": "./data",
    "output_dir": "./output/train_20motif",
    "checkpoint_dir": "./checkpoints/train_20motif",
    "log_dir": "./logs/train_20motif",
    "num_class_images": 200,
    "prior_generation_precision": "fp16"
  },
+ "random_state": [
73
+ 3,
74
+ [
75
+ 2147483648,
76
+ 3564348608,
77
+ 1266698288,
78
+ 4212342371,
79
+ 3595291661,
80
+ 3180588708,
81
+ 3037210256,
82
+ 946923017,
83
+ 2565409715,
84
+ 2900535780,
85
+ 924383152,
86
+ 4180157270,
87
+ 4230508198,
88
+ 2039675917,
89
+ 3755350407,
90
+ 2362848650,
91
+ 2818100609,
92
+ 2097423432,
93
+ 524478045,
94
+ 540883378,
95
+ 281170210,
96
+ 1485176884,
97
+ 1493190386,
98
+ 1773214509,
99
+ 380915208,
100
+ 3667698522,
101
+ 2648371337,
102
+ 2961234806,
103
+ 3857480267,
104
+ 1582950522,
105
+ 246289694,
106
+ 3322185604,
107
+ 1944574775,
108
+ 302623699,
109
+ 169865066,
110
+ 1143540808,
111
+ 3733177770,
112
+ 513116636,
113
+ 1411153081,
114
+ 3205493053,
115
+ 768926902,
116
+ 549624109,
117
+ 1470655403,
118
+ 59539609,
119
+ 3678480009,
120
+ 3087139671,
121
+ 1176835859,
122
+ 2078491503,
123
+ 2299934332,
124
+ 1592059249,
125
+ 1062716176,
126
+ 2654193596,
127
+ 3531838733,
128
+ 2661260596,
129
+ 3881209635,
130
+ 2106865768,
131
+ 4154287292,
132
+ 2082185616,
133
+ 2301197011,
134
+ 2177349827,
135
+ 3082181756,
136
+ 1787663536,
137
+ 3714670796,
138
+ 3018262113,
139
+ 1670056238,
140
+ 1856738750,
141
+ 99824592,
142
+ 2279837081,
143
+ 1414647942,
144
+ 3416675731,
145
+ 3458782472,
146
+ 3997022236,
147
+ 468762002,
148
+ 2666158583,
149
+ 953353270,
150
+ 1788980658,
151
+ 3802061067,
152
+ 407586584,
153
+ 1844776834,
154
+ 1906917274,
155
+ 3154715663,
156
+ 3028370222,
157
+ 4156024188,
158
+ 3996363428,
159
+ 80495456,
160
+ 2659800972,
161
+ 2005649973,
162
+ 3818358673,
163
+ 3952623596,
164
+ 2506862371,
165
+ 3282302532,
166
+ 263923435,
167
+ 3384662671,
168
+ 3292439172,
169
+ 3119957588,
170
+ 1224426111,
171
+ 899864150,
172
+ 215262826,
173
+ 1619647231,
174
+ 3347694949,
175
+ 3497868538,
176
+ 2029552053,
177
+ 2992804824,
178
+ 4080010250,
179
+ 2023513186,
180
+ 1885979437,
181
+ 3564622190,
182
+ 3775424270,
183
+ 2297810139,
184
+ 3549449169,
185
+ 2664856277,
186
+ 3274801974,
187
+ 2794883969,
188
+ 980412666,
189
+ 2980215653,
190
+ 2794389321,
191
+ 2816521934,
192
+ 1266970739,
193
+ 542306338,
194
+ 3646225311,
195
+ 3598997630,
196
+ 2111980720,
197
+ 2949252482,
198
+ 2489027658,
199
+ 352815024,
200
+ 11610683,
201
+ 1386663624,
202
+ 2004196796,
203
+ 1161461546,
204
+ 1921293780,
205
+ 2463949525,
206
+ 1647009713,
207
+ 3550093655,
208
+ 2563894064,
209
+ 3486310554,
210
+ 1506105865,
211
+ 243092931,
212
+ 2659437476,
213
+ 4200687059,
214
+ 2284345122,
215
+ 1974438610,
216
+ 3591096528,
217
+ 967119212,
218
+ 3362401375,
219
+ 140678365,
220
+ 311602112,
221
+ 2361740275,
222
+ 2139598582,
223
+ 3632873481,
224
+ 2762232439,
225
+ 4156482318,
226
+ 381637792,
227
+ 3253346525,
228
+ 2492118775,
229
+ 1502434558,
230
+ 3164497290,
231
+ 3550998357,
232
+ 2412448305,
233
+ 2223955385,
234
+ 4122879535,
235
+ 350121793,
236
+ 1835149778,
237
+ 2175117867,
238
+ 989674750,
239
+ 3178241202,
240
+ 3553093569,
241
+ 3470650311,
242
+ 2829698151,
243
+ 3209427769,
244
+ 1779174943,
245
+ 275388428,
246
+ 4044574515,
247
+ 715447260,
248
+ 3180940440,
249
+ 4020772289,
250
+ 1322708567,
251
+ 3189868792,
252
+ 4250485633,
253
+ 716970023,
254
+ 2307550151,
255
+ 1074996711,
256
+ 1217573599,
257
+ 197006094,
258
+ 2178394212,
259
+ 1255233746,
260
+ 4164251484,
261
+ 1405608772,
262
+ 2808160475,
263
+ 1304736088,
264
+ 1796071066,
265
+ 2761748078,
266
+ 3570739698,
267
+ 1616118556,
268
+ 2232868135,
269
+ 3567541936,
270
+ 3470600401,
271
+ 3031621994,
272
+ 3351764214,
273
+ 1359785149,
274
+ 2617497797,
275
+ 3340028190,
276
+ 356162828,
277
+ 2083806068,
278
+ 2503635608,
279
+ 4024838996,
280
+ 2577080371,
281
+ 2897993505,
282
+ 3120733934,
283
+ 905794891,
284
+ 2506078507,
285
+ 4211618666,
286
+ 3777871979,
287
+ 809751414,
288
+ 4080874167,
289
+ 1562977008,
290
+ 3917373055,
291
+ 2132779194,
292
+ 4014249473,
293
+ 4067327082,
294
+ 2582869847,
295
+ 1780081876,
296
+ 1842619106,
297
+ 3381761227,
298
+ 921004274,
299
+ 1393256920,
300
+ 1883566732,
301
+ 2702071861,
302
+ 865327389,
303
+ 1622085203,
304
+ 3021825820,
305
+ 2687061406,
306
+ 1748902923,
307
+ 689023977,
308
+ 308399650,
309
+ 2377287978,
310
+ 1646969411,
311
+ 1051806316,
312
+ 4277884230,
313
+ 2041056290,
314
+ 101134519,
315
+ 2032472116,
316
+ 4112521069,
317
+ 151202901,
318
+ 2773743461,
319
+ 551348559,
320
+ 3476836808,
321
+ 510935951,
322
+ 625057077,
323
+ 3757450756,
324
+ 2977698135,
325
+ 3027776859,
326
+ 2616998041,
327
+ 2773430005,
328
+ 544190486,
329
+ 2241368212,
330
+ 1141105829,
331
+ 1452816309,
332
+ 4199229235,
333
+ 3218013033,
334
+ 4229475816,
335
+ 1659576351,
336
+ 3020348754,
337
+ 1193400518,
338
+ 3208584597,
339
+ 1151197733,
340
+ 2597187966,
341
+ 503065140,
342
+ 2421841572,
343
+ 1437291709,
344
+ 1909275895,
345
+ 2872630545,
346
+ 793588217,
347
+ 3792934707,
348
+ 1784451785,
349
+ 2921385648,
350
+ 1669902526,
351
+ 4189978976,
352
+ 1196986251,
353
+ 434805516,
354
+ 1907541826,
355
+ 2624415034,
356
+ 1687778718,
357
+ 650746582,
358
+ 1949153382,
359
+ 4148493093,
360
+ 841300520,
361
+ 1164202054,
362
+ 4203468658,
363
+ 4106300911,
364
+ 850346789,
365
+ 1715730760,
366
+ 3114661489,
367
+ 2866524548,
368
+ 1360448945,
369
+ 3601318775,
370
+ 1743078223,
371
+ 2413855408,
372
+ 1211895622,
373
+ 325117146,
374
+ 2721152875,
375
+ 1284334485,
376
+ 2446538832,
377
+ 739014618,
378
+ 2237045115,
379
+ 842553465,
380
+ 2538598293,
381
+ 746460793,
382
+ 4010387366,
383
+ 2002655192,
384
+ 4193733112,
385
+ 1194380773,
386
+ 3918217378,
387
+ 1447487475,
388
+ 5659228,
389
+ 3408847694,
390
+ 4190318700,
391
+ 1862549564,
392
+ 781683719,
393
+ 1194618118,
394
+ 755053413,
395
+ 3436011942,
396
+ 2885435303,
397
+ 3081151348,
398
+ 2017642831,
399
+ 1053816502,
400
+ 1086627485,
401
+ 2157296554,
402
+ 110650022,
403
+ 965352898,
404
+ 1003174194,
405
+ 1288956241,
406
+ 4057404871,
407
+ 2965068465,
408
+ 2897064481,
409
+ 2457377317,
410
+ 1879872545,
411
+ 358455290,
412
+ 375086701,
413
+ 3015902095,
414
+ 1676249984,
415
+ 924455526,
416
+ 2084169389,
417
+ 1989014644,
418
+ 1993749926,
419
+ 2009424973,
420
+ 2113340508,
421
+ 3980883273,
422
+ 2915977458,
423
+ 203328382,
424
+ 3020815229,
425
+ 2415050113,
426
+ 4103009585,
427
+ 3700885489,
428
+ 2916647550,
429
+ 1523006503,
430
+ 174302338,
431
+ 2476909338,
432
+ 1969322490,
433
+ 4285741984,
434
+ 1528449097,
435
+ 3355315515,
436
+ 4217241278,
437
+ 599579127,
438
+ 2572243673,
439
+ 3035856735,
440
+ 1539140489,
441
+ 1782314913,
442
+ 4238644287,
443
+ 1746424142,
444
+ 1978148312,
445
+ 2380746849,
446
+ 184941882,
447
+ 1106717981,
448
+ 1720750349,
449
+ 981701307,
450
+ 3953154731,
451
+ 3257809181,
452
+ 2892339376,
453
+ 3339778166,
454
+ 3676936849,
455
+ 87425948,
456
+ 3029257381,
457
+ 2037942523,
458
+ 3807628706,
459
+ 2861474706,
460
+ 1058852346,
461
+ 1322765211,
462
+ 2686046342,
463
+ 2689342655,
464
+ 2303436168,
465
+ 2571627181,
466
+ 1986057734,
467
+ 1183564308,
468
+ 2829677523,
469
+ 1295563975,
470
+ 503126586,
471
+ 2025890348,
472
+ 4179277821,
473
+ 1735262467,
474
+ 981331774,
475
+ 1613447066,
476
+ 1011606109,
477
+ 2000062246,
478
+ 3581448390,
479
+ 3477731384,
480
+ 3641307373,
481
+ 3508544379,
482
+ 2327233491,
483
+ 3931944343,
484
+ 4189052882,
485
+ 2990416380,
486
+ 422406169,
487
+ 202291313,
488
+ 2531006461,
489
+ 4277024116,
490
+ 3815144003,
491
+ 821314585,
492
+ 1344175168,
493
+ 3562834071,
494
+ 1339615445,
495
+ 1831545190,
496
+ 3115548822,
497
+ 743512780,
498
+ 4006999448,
499
+ 3720181735,
500
+ 1012033521,
501
+ 919931041,
502
+ 2628967879,
503
+ 1151876565,
504
+ 1268107129,
505
+ 3674829936,
506
+ 834977846,
507
+ 743987006,
508
+ 3947536548,
509
+ 3706529695,
510
+ 4121073678,
511
+ 2507605742,
512
+ 1595636918,
513
+ 2708047833,
514
+ 2427507331,
515
+ 3868216331,
516
+ 3254240010,
517
+ 2097683411,
518
+ 3279710596,
519
+ 3686819053,
520
+ 1843541720,
521
+ 1683793619,
522
+ 3245287285,
523
+ 3571828776,
524
+ 3733296431,
525
+ 3806747478,
526
+ 1390930605,
527
+ 3860422228,
528
+ 114397037,
529
+ 1931519825,
530
+ 2770684378,
531
+ 1556101783,
532
+ 1436111731,
533
+ 4031950081,
534
+ 562876656,
535
+ 1775895782,
536
+ 612364620,
537
+ 1313509772,
538
+ 4283410242,
539
+ 3252958463,
540
+ 2176555836,
541
+ 3933073367,
542
+ 3013277102,
543
+ 1444071961,
544
+ 3120949516,
545
+ 2824578890,
546
+ 325676929,
547
+ 943677134,
548
+ 1800649256,
549
+ 1721927060,
550
+ 347498719,
551
+ 1435221321,
552
+ 2623572981,
553
+ 1408548470,
554
+ 4145586315,
555
+ 2901889237,
556
+ 1849377952,
557
+ 1239144551,
558
+ 3382598266,
559
+ 2992893897,
560
+ 3738297588,
561
+ 611280106,
562
+ 3897415338,
563
+ 2370299241,
564
+ 1772308583,
565
+ 3697465753,
566
+ 354508058,
567
+ 2702360134,
568
+ 591308331,
569
+ 3524072501,
570
+ 976616000,
571
+ 2563717192,
572
+ 3078266097,
573
+ 1376594703,
574
+ 4209795919,
575
+ 2454412767,
576
+ 2712206031,
577
+ 2963860163,
578
+ 3734324882,
579
+ 2248653800,
580
+ 324872786,
581
+ 3789837448,
582
+ 3779000146,
583
+ 527733939,
584
+ 2844165793,
585
+ 576499681,
586
+ 1618787435,
587
+ 2638888650,
588
+ 57511068,
589
+ 2804627518,
590
+ 2993670030,
591
+ 481402236,
592
+ 2810124845,
593
+ 1416045214,
594
+ 1723694191,
595
+ 1214944572,
596
+ 3188123783,
597
+ 1139185907,
598
+ 3851015362,
599
+ 1719652470,
600
+ 1661343029,
601
+ 3644307578,
602
+ 3564178709,
603
+ 1256656955,
604
+ 46631590,
605
+ 4231317929,
606
+ 3098958589,
607
+ 1834956625,
608
+ 2206185428,
609
+ 3695688374,
610
+ 3647957317,
611
+ 1064098871,
612
+ 1739100906,
613
+ 2579568980,
614
+ 27974051,
615
+ 2617466775,
616
+ 964075233,
617
+ 907049942,
618
+ 4164146575,
619
+ 3377168066,
620
+ 2524828266,
621
+ 1083546008,
622
+ 2992960953,
623
+ 2260789066,
624
+ 1543742095,
625
+ 2843842831,
626
+ 1375722284,
627
+ 3574521313,
628
+ 110842534,
629
+ 2310998251,
630
+ 3076511734,
631
+ 783145600,
632
+ 1287776608,
633
+ 3087144146,
634
+ 305559823,
635
+ 2356293719,
636
+ 3228441476,
637
+ 1678938122,
638
+ 3775814061,
639
+ 1620283952,
640
+ 2512027726,
641
+ 1031432407,
642
+ 962295099,
643
+ 3877418501,
644
+ 968669928,
645
+ 304126693,
646
+ 3711291137,
647
+ 3847527101,
648
+ 494066767,
649
+ 4050229756,
650
+ 4169448589,
651
+ 671763915,
652
+ 1095747781,
653
+ 4006132710,
654
+ 394725957,
655
+ 200521654,
656
+ 2715998750,
657
+ 1477567673,
658
+ 895171901,
659
+ 3370105999,
660
+ 2684157455,
661
+ 4153990023,
662
+ 3966076501,
663
+ 2043374409,
664
+ 144443759,
665
+ 6764556,
666
+ 1611650045,
667
+ 1480956755,
668
+ 1388276468,
669
+ 4136518438,
670
+ 1538041336,
671
+ 266773992,
672
+ 1623357516,
673
+ 2267298390,
674
+ 3183919402,
675
+ 1084292424,
676
+ 2796136160,
677
+ 2413448816,
678
+ 2850375199,
679
+ 3510894040,
680
+ 2644778623,
681
+ 3317288284,
682
+ 3697317540,
683
+ 1465776787,
684
+ 1843489446,
685
+ 1416711171,
686
+ 744701117,
687
+ 1286781349,
688
+ 3748640476,
689
+ 861982119,
690
+ 2377742909,
691
+ 1171768136,
692
+ 2701877439,
693
+ 3839724288,
694
+ 2869791015,
695
+ 2386067954,
696
+ 2629214347,
697
+ 955801623,
698
+ 3831079317,
699
+ 624
700
+ ],
701
+ null
702
+ ],
703
+ "numpy_state": [
704
+ "MT19937",
705
+ "[ 42 3107752595 1895908407 3900362577 3030691166 4081230161\n 2732361568 1361238961 3961642104 867618704 2837705690 3281374275\n 3928479052 3691474744 3088217429 1769265762 3769508895 2731227933\n 2930436685 486258750 1452990090 3321835500 3520974945 2343938241\n 928051207 2811458012 3391994544 3688461242 1372039449 3706424981\n 1717012300 1728812672 1688496645 1203107765 1648758310 440890502\n 1396092674 626042708 3853121610 669844980 2992565612 310741647\n 3820958101 3474052697 305511342 2053450195 705225224 3836704087\n 3293527636 1140926340 2738734251 574359520 1493564308 269614846\n 427919468 2903547603 2957214125 181522756 4137743374 2557886044\n 3399018834 1348953650 1575066973 3837612427 705360616 4138204617\n 1604205300 1605197804 590851525 2371419134 2530821810 4183626679\n 2872056396 3895467791 1156426758 184917518 2502875602 2730245981\n 3251099593 2228829441 2591075711 3048691618 3030004338 1726207619\n 993866654 823585707 936803789 3180156728 1191670842 348221088\n 988038522 3281236861 1153842962 4152167900 98291801 816305276\n 575746380 1719541597 2584648622 1791391551 3234806234 413529090\n 219961136 4180088407 1135264652 3923811338 2304598263 762142228\n 1980420688 1225347938 3657621885 3762382117 1157119598 2556627260\n 2276905960 3857700293 1903185298 4258743924 2078637161 4160077183\n 3569294948 2138906140 1346725611 1473959117 2798330104 3785346335\n 4103334026 3448442764 1142532843 4278036691 3071994514 3474299731\n 1121195796 1536841934 2132070705 1064908919 2840327803 992870214\n 2041326888 2906112696 4182466030 1031463950 703166484 854266995\n 4157971695 4071962029 2600094776 2770410869 3776335751 2599879593\n 2451043853 2223709058 2098813464 4008111478 2959232195 3072496064\n 2498909222 4020139729 785990520 958060279 4183949075 2392404465\n 533774465 4092066952 3967420027 1726137853 2907699474 3158758391\n 1460845905 1323598137 2446717890 3004885867 3447263769 1378488047\n 3172418196 652839901 1695052769 226007057 778836071 1216725078\n 
655651335 1850195064 427367795 800074262 2241880422 1713434925\n 339981078 1730571881 672610244 1952245009 2729177102 3516932475\n 4032720152 3177283432 411893652 2440235559 3587427933 43170267\n 39225133 3904203400 1935961247 3843123487 1625453782 1337993374\n 2095455879 3402219947 634671126 70868861 3072823841 851862432\n 1828056818 2794213810 1222863684 2164539406 4249334162 1380362252\n 1512719097 2773165233 4063118969 3041859837 529421431 563872464\n 2478730478 3168749051 4132953373 3922807735 1124217574 1970058502\n 1744120743 1906315107 1074758800 1611130652 2878846041 886823888\n 1175456250 1669874674 2428820171 1044308794 3841962192 138850094\n 1239727126 1753711876 2194286827 872797664 4276240980 690338888\n 4087206238 2279169960 1117436170 3344885072 3127829945 315537090\n 3802787206 4157203318 1637047079 3774106877 3230158646 1855823338\n 1931415993 667252379 4288528171 1587598285 1096793218 1916566454\n 101891899 2354644560 3351208292 1467125166 2177732119 4122299478\n 3904084887 2653591155 4201043109 2867379343 2660555187 3641744616\n 4126452939 326579197 2697259239 3365236848 3007834487 4118919490\n 3306741951 2285455175 1956645973 1879691841 891565150 1843460149\n 2013381028 819311674 123282948 1436558519 1154343666 206804484\n 1650349242 2142011886 304163699 2608574600 2500624796 2996744833\n 2344192475 3152512202 165571606 691170269 1806226529 568535825\n 1243813863 3068953841 3843784723 1540495237 4246006858 1303595780\n 3288680241 864868851 819595545 3230857496 3574119395 1545404573\n 2970139338 4292786727 1803072884 1374565738 1736333177 1978645403\n 3962597126 1068006206 3458125500 168085922 1597587506 2052497512\n 1323596727 2421372441 1468386547 3574947527 3363915938 860279252\n 1309097460 3065417722 1490716202 3476091722 1669402145 895071221\n 1432690175 3353592973 149850974 2789493615 826939483 666980418\n 755367270 3988951195 21783894 1924727373 1699517788 1152431122\n 2593798113 3522529522 2797535609 4018366956 2350035889 3010507270\n 
2832621820 627979167 997422629 365587204 2302500352 1720920631\n 689999548 3713985947 3267499624 1971264680 1981530399 1662926921\n 1833821660 1422522022 3141447769 2727954526 4172728772 1787436028\n 1902276939 3145551277 4207627911 2497093521 4111966589 3929089589\n 2253454030 1069424637 2165048659 2848813944 2435898022 2546206777\n 3864777677 3107311565 3776562483 1040285049 3171631943 2404677828\n 2522848682 2930777301 2831905121 1436989598 602730315 664177960\n 3959954010 3116042160 2881899726 233404945 4058465099 1781994751\n 485046222 2776777695 432082123 1989128370 86344507 2510576356\n 2194076764 1742125237 3715839140 895100548 147445686 705462897\n 2245325113 1052295404 1956014786 2916055958 1829369612 2541711050\n 1594343058 3708804266 150438233 323857098 294681952 783931535\n 606075163 2427042904 121207604 3943199031 1196785464 1818211378\n 1788241109 3138862427 2037307093 2306750301 1644605749 165986111\n 542190743 486828112 1757411662 894543082 4108143634 1232805238\n 3801632949 3863166865 713767006 2091486427 3174776264 1157004409\n 623072544 1667151721 3361539538 696723008 3247069452 682044344\n 1382136166 1385645682 4219951151 2747881261 2489355869 786564174\n 2040230554 2967874556 1414286092 2677969656 1393412218 2216095072\n 935533444 3662643439 3285199608 3103672804 522796956 3952383595\n 1928659176 3397717710 4278554051 1984736931 3559102926 1878353094\n 875578217 2398931796 2313634006 1606027661 2790634022 2334166559\n 1857067101 666458681 1626872683 2155121857 715449823 1865157100\n 2938814835 4084911240 45488075 3474982924 1750873825 2246019159\n 125388929 1110287838 652200437 4212247716 2702974687 2963764270\n 208692058 3170393729 1378248367 752591527 591629541 2253399388\n 2402291226 3089656189 3202324513 3818308310 2828131601 2690672008\n 3676629884 1007739430 4072247562 3574795162 518485611 1889402182\n 3687902739 3410263649 2790674620 779455241 3573984673 3053204735\n 4089925351 789980683 476440431 3843536868 2400661309 3139919094\n 
1643266656 113318754 428163528 2386492935 3807242009 574560611\n 3174039857 3774465602 1164640969 455942925 1374407495 2562304709\n 1024844203 521375136 417432138 1203241821 2900988280 2841030991\n 2301700751 369508560 2396447808 1891459643 4225682708 3930667846\n 1518293357 2697063889 3113075061 2411136298 2836361984 4105335811\n 914081338 2675982621 1816939127 1596754123 1464603632 1598478676\n 1318403529 4016663081 2106416852 2757323084 2042842122 1175184796\n 2212339255 1334626864 3994484893 3938045599 2166620630 3036360431\n 397499085 975931950 1868702836 3530424696 3532548823 2770836469\n 3537418693 3344319345 3208552526 1771170897 4097379814 3761572528\n 2794194423 706836738 2953105956 3446096217 220984542 309619699\n 223913021 3985142640 1757616575 2582763607 4018329835 1393278443\n 4121569718 2087146446 4282833425 807775617 1396604749 3571181413\n 90301352 2618014643 2783561793 1329389532 836540831 26719530]",
706
+ 624,
707
+ 0,
708
+ 0.0
709
+ ],
710
+ "torch_state": "tensor([42, 0, 0, ..., 0, 0, 0], dtype=torch.uint8)"
711
+ }
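The state saved above appears to be a serialized Mersenne Twister generator: a 624-word key array, the position pointer (624), a `has_gauss` flag (0), and the cached gaussian (0.0), plus a torch RNG byte tensor, so training can resume deterministically. Python's stdlib `random` module uses the same MT19937 core, which makes the layout easy to illustrate without numpy or torch (the mapping to this file's fields is my reading, not documented in the repo):

```python
import random

# Python's stdlib random generator is also MT19937, so its state
# mirrors the layout serialized above: a 624-word key array plus
# the current position index.
random.seed(42)  # the dump above suggests the run was seeded with 42
version, internal_state, gauss_next = random.getstate()

# 624 key words + 1 position pointer
assert len(internal_state) == 625
```

Restoring such a state with `random.setstate(...)` (or the numpy/torch equivalents) is what lets an interrupted DreamBooth run continue with an identical sample order.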
unet/config.json ADDED
@@ -0,0 +1,68 @@
+ {
+ "_class_name": "UNet2DConditionModel",
+ "_diffusers_version": "0.21.4",
+ "_name_or_path": "izzudd/sd-batik-llava",
+ "act_fn": "silu",
+ "addition_embed_type": null,
+ "addition_embed_type_num_heads": 64,
+ "addition_time_embed_dim": null,
+ "attention_head_dim": 8,
+ "attention_type": "default",
+ "block_out_channels": [
+ 320,
+ 640,
+ 1280,
+ 1280
+ ],
+ "center_input_sample": false,
+ "class_embed_type": null,
+ "class_embeddings_concat": false,
+ "conv_in_kernel": 3,
+ "conv_out_kernel": 3,
+ "cross_attention_dim": 768,
+ "cross_attention_norm": null,
+ "down_block_types": [
+ "CrossAttnDownBlock2D",
+ "CrossAttnDownBlock2D",
+ "CrossAttnDownBlock2D",
+ "DownBlock2D"
+ ],
+ "downsample_padding": 1,
+ "dropout": 0.0,
+ "dual_cross_attention": false,
+ "encoder_hid_dim": null,
+ "encoder_hid_dim_type": null,
+ "flip_sin_to_cos": true,
+ "freq_shift": 0,
+ "in_channels": 4,
+ "layers_per_block": 2,
+ "mid_block_only_cross_attention": null,
+ "mid_block_scale_factor": 1,
+ "mid_block_type": "UNetMidBlock2DCrossAttn",
+ "norm_eps": 1e-05,
+ "norm_num_groups": 32,
+ "num_attention_heads": null,
+ "num_class_embeds": null,
+ "only_cross_attention": false,
+ "out_channels": 4,
+ "projection_class_embeddings_input_dim": null,
+ "resnet_out_scale_factor": 1.0,
+ "resnet_skip_time_act": false,
+ "resnet_time_scale_shift": "default",
+ "reverse_transformer_layers_per_block": null,
+ "sample_size": 64,
+ "time_cond_proj_dim": null,
+ "time_embedding_act_fn": null,
+ "time_embedding_dim": null,
+ "time_embedding_type": "positional",
+ "timestep_post_act": null,
+ "transformer_layers_per_block": 1,
+ "up_block_types": [
+ "UpBlock2D",
+ "CrossAttnUpBlock2D",
+ "CrossAttnUpBlock2D",
+ "CrossAttnUpBlock2D"
+ ],
+ "upcast_attention": false,
+ "use_linear_projection": false
+ }
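This config pins the standard Stable Diffusion v1.5 UNet geometry: a 4-channel latent input, 64x64 latent sample size (512x512 pixels after the VAE's 8x upscaling), and a 768-wide cross-attention dimension matching the CLIP ViT-L/14 text encoder. A quick stdlib-only sanity check over a few fields copied from the file above:

```python
import json

# A few fields copied from unet/config.json above.
config = json.loads("""{
  "_class_name": "UNet2DConditionModel",
  "sample_size": 64,
  "in_channels": 4,
  "cross_attention_dim": 768,
  "block_out_channels": [320, 640, 1280, 1280]
}""")

# SD v1.5 works in a latent space downscaled 8x by the VAE,
# so a 64x64 latent decodes to a 512x512 image.
assert config["sample_size"] * 8 == 512

# 768 matches the hidden width of the CLIP ViT-L/14 text encoder
# that conditions the cross-attention layers.
assert config["cross_attention_dim"] == 768
```

In practice, `diffusers` reads this file automatically when the `unet` subfolder is loaded as part of the pipeline, so nothing here needs to be edited by hand.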
unet/diffusion_pytorch_model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:35058ce48896aeb31078417c043b886f27d02142e76a73e5dbe71e1681cfaa23
+ size 3438167536
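The file committed here is a Git LFS pointer: the repository stores only this three-line stub, while the ~3.4 GB of UNet weights live in LFS storage and are fetched on demand. A minimal stdlib parser for the pointer format (the `parse_lfs_pointer` helper is illustrative, not part of any library):

```python
# Each line of an LFS pointer is "key value"; parse into a dict.
def parse_lfs_pointer(text: str) -> dict:
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:35058ce48896aeb31078417c043b886f27d02142e76a73e5dbe71e1681cfaa23
size 3438167536
"""
info = parse_lfs_pointer(pointer)

# size is in bytes: ~3.44 GB, consistent with fp32 SD v1.5 UNet weights
assert int(info["size"]) == 3_438_167_536
assert info["oid"].startswith("sha256:")
```

Tools like `huggingface_hub` resolve the pointer transparently, so `from_pretrained` downloads the real safetensors file rather than this stub.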