crystantine committed on
Commit cb1a45b · 1 Parent(s): 30c16af

Upload 37 files
.gitattributes CHANGED
@@ -53,3 +53,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.jpg filter=lfs diff=lfs merge=lfs -text
  *.jpeg filter=lfs diff=lfs merge=lfs -text
  *.webp filter=lfs diff=lfs merge=lfs -text
+ fonts/CALIBRI.TTF filter=lfs diff=lfs merge=lfs -text
.gitignore ADDED
@@ -0,0 +1,2 @@
+ __pycache__
+ Config.json
README.md ADDED
@@ -0,0 +1,48 @@
+ # ComfyUI-extra-nodes - quality of life
+ Extra nodes to be used in ComfyUI, including a new ChatGPT node for generating natural language responses.
+
+ ## ComfyUI
+ ComfyUI is an advanced node-based UI that utilizes Stable Diffusion, allowing you to create customized workflows such as image post-processing or conversions.
+
+ ## How to install
+ Download the zip file.
+ Extract it to ..\ComfyUI\custom_nodes.
+ Restart ComfyUI if it was running (reloading the web page is not enough).
+ You will find my nodes under the new group O/....
+
+ ## How to update
+ - quality of life will auto-update each time you run ComfyUI
+ - if you want to stop auto-update, edit config.json and set "autoUpdate": false
+
+ ## Current nodes
+ ## ChatGPT
+ This node harnesses the power of ChatGPT, an advanced language model that can generate detailed image descriptions from a small input.
+ - You need an OpenAI API key, which you can find at https://beta.openai.com/docs/developer-apis/overview
+ - Once you have your API key, add it to config.json
+ - I have kept it in a separate file, so that the API key doesn't get embedded in the generated images.
+
+ ## String Suit
+ This set of nodes adds support for string manipulation and includes a tool to generate an image from text.
+
+ - String: A node that can hold a string (text).
+ - Debug String: This node writes the string to the console.
+ - Concat String: This node combines two strings together.
+ - Trim String: This node removes any extra spaces at the start or end of a string.
+ - Replace String & Replace String Advanced: These nodes replace part of the text with another part.
+ ### String2image
+ This node generates an image based on text, which can be used with ControlNet to add text to the image. The tool supports various fonts; you can add the font you want in the fonts folder. If you load the example image in ComfyUI, the workflow that generated it will be loaded.
+
+ ### CLIPStringEncode
+ This node is a variant of the ClipTextEncode node, but it receives the text from the String node, so you don't have to retype your prompt anymore.
+
+ ## Other tools
+ ### LatentUpscaleMultiply
+ This node is a variant of the original LatentUpscale tool, but instead of using width and height, you use a multiply factor. For example, if the original image dimensions are (512,512) and the mul values are (2,2), the resulting image will be (1024,1024). You can also use it to downscale by using fractions, e.g., (512,512) mul (.5,.5) → (256,256).
+
+ Node Path: O/Latent/LatentUpscaleMultiply
+
+ ## Thanks for reading my message, and I hope that my tools will help you.
+
+ ## Contact
+ ### Discord: Omar92#3374
+ ### GitHub: omar92 (https://github.com/omar92)
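The multiply-based upscale described in the README can be sketched as a few lines of Python (the helper name `multiply_dims` is hypothetical, not the node's actual code):

```python
def multiply_dims(width, height, w_mul, h_mul):
    # Multiply each dimension by its factor, as LatentUpscaleMultiply does;
    # fractions below 1.0 therefore downscale.
    return int(width * w_mul), int(height * h_mul)

print(multiply_dims(512, 512, 2, 2))      # → (1024, 1024)
print(multiply_dims(512, 512, .5, .5))    # → (256, 256)
```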
Workflows/AdvancedReplace.png ADDED

Git LFS Details

  • SHA256: fd076d47542634052f3a6f0f0929bb9a55c8b0e58e0525be5dc5c59020ec0891
  • Pointer size: 130 Bytes
  • Size of remote file: 75.5 kB
Workflows/ChatGPT.png ADDED

Git LFS Details

  • SHA256: 3fe0f6ef477262761a24a3ecc25183bd2a0b26eb095fe66e300712a6a9cf6e26
  • Pointer size: 131 Bytes
  • Size of remote file: 429 kB
Workflows/ChatGPT_Advanced.png ADDED

Git LFS Details

  • SHA256: 55da065e97e6ddf618af494cfaa2be42ea91064f3763cac7aed185d8e9ec4cf2
  • Pointer size: 131 Bytes
  • Size of remote file: 256 kB
Workflows/ComfyUI_00471_.png ADDED

Git LFS Details

  • SHA256: 61f5c37d886d5acd789085d4d0f22181b20c37c55e0e70b6bb142830ba41fdb2
  • Pointer size: 131 Bytes
  • Size of remote file: 512 kB
Workflows/ComfyUI_01051_.png ADDED

Git LFS Details

  • SHA256: aca80d400e0e3e82261c04f6f9d8f9cb864bf251ef631b7afe3e58ba3fef0fc4
  • Pointer size: 132 Bytes
  • Size of remote file: 1.11 MB
Workflows/ComfyUI_01176_.png ADDED

Git LFS Details

  • SHA256: 41f7caf6ca360298340e1cfd265a94cec2bcc586418fa0941200c87978e508ee
  • Pointer size: 132 Bytes
  • Size of remote file: 1.44 MB
Workflows/ComfyUI_01418_.png ADDED

Git LFS Details

  • SHA256: 42d234045cebd557296ed7053ad551b580b896d8f54734e95303ad47a72d972e
  • Pointer size: 131 Bytes
  • Size of remote file: 412 kB
Workflows/ComfyUI_02496_.png ADDED

Git LFS Details

  • SHA256: c73e38ad3dda4f8c37f899d0d72677aa9098e65ca21e922f5343bd48d7aeb7c3
  • Pointer size: 131 Bytes
  • Size of remote file: 412 kB
Workflows/OpenAIAdvanced.png ADDED

Git LFS Details

  • SHA256: b48b784886f530ddf3348acfbca9b176a921db5a638a11da0d67b28a86c3ba23
  • Pointer size: 131 Bytes
  • Size of remote file: 408 kB
Workflows/String manipulation.png ADDED

Git LFS Details

  • SHA256: 23583f018dcc20143019e1905506d4c28a234811558a7e62ddc2052d28409313
  • Pointer size: 130 Bytes
  • Size of remote file: 82.8 kB
Workflows/Text.png ADDED

Git LFS Details

  • SHA256: def3db0d15fee8bc1f1eafccbb3a5881143914a45895fc807b5c1eb91cf6065c
  • Pointer size: 131 Bytes
  • Size of remote file: 154 kB
Workflows/in2wid.png ADDED

Git LFS Details

  • SHA256: e811ef6f79aac4d5d6c2d98d3016b7b906dc3c8263b448147e4a94078a5e17ce
  • Pointer size: 130 Bytes
  • Size of remote file: 56.5 kB
Workflows/latentUpscaleMultiply.png ADDED

Git LFS Details

  • SHA256: 320ef73f4eb7079bcaefaba0040ddf8e2a5366faf8738be8b7a531781dd11d39
  • Pointer size: 131 Bytes
  • Size of remote file: 111 kB
Workflows/numbers.png ADDED

Git LFS Details

  • SHA256: 9db12b5c5856795a4497c44745e26363b4dfa371863d228dc9fec0bb41167228
  • Pointer size: 130 Bytes
  • Size of remote file: 59.4 kB
Workflows/openAI.png ADDED

Git LFS Details

  • SHA256: 4c981be88ecef5f037b7911d691f5295dbdec717d36b7c2b79ab38bac23b7984
  • Pointer size: 131 Bytes
  • Size of remote file: 197 kB
Workflows/other.png ADDED

Git LFS Details

  • SHA256: f9c77865c6abdac5df5a9c80b70ad33d31b01bcbec988e0787ee12a04ddbede1
  • Pointer size: 131 Bytes
  • Size of remote file: 119 kB
Workflows/selectLatentFromBatch.png ADDED

Git LFS Details

  • SHA256: 4ba94eceff4f90ad7db20f07678b9d4900a5d810186da026a1e8e9e93432b70e
  • Pointer size: 131 Bytes
  • Size of remote file: 226 kB
Workflows/string_o.png ADDED

Git LFS Details

  • SHA256: 83fbc74da34d4de9a1828d972a0fbb6e32476a675a14b78d6cbbd167556c5bbd
  • Pointer size: 131 Bytes
  • Size of remote file: 473 kB
__init__.py ADDED
@@ -0,0 +1,54 @@
+
+ # ----------------------------------------------------------
+ # update logic
+ import os
+ import json
+
+ # check if config file exists
+ if not os.path.isfile(os.path.join(os.path.dirname(os.path.realpath(__file__)), "config.json")):
+     # create config file
+     config = {
+         "autoUpdate": True,
+         "branch": "main",
+         "openAI_API_Key": "sk-#########################################"
+     }
+     with open(os.path.join(os.path.dirname(os.path.realpath(__file__)), "config.json"), "w") as f:
+         json.dump(config, f, indent=4)
+
+ # load config file
+ with open(os.path.join(os.path.dirname(os.path.realpath(__file__)), "config.json"), "r") as f:
+     config = json.load(f)
+
+ # check if autoUpdate is set
+ if "autoUpdate" not in config:
+     config["autoUpdate"] = False
+
+ # check if branch is set
+ if "branch" not in config:
+     config["branch"] = "main"
+
+ if config["autoUpdate"] == True:
+     try:
+         from .update import update as update
+         currentPath = os.path.dirname(os.path.realpath(__file__))
+         # run update/update.py to update the node class mappings
+         update.update(currentPath, branch_name=config["branch"])
+     except ImportError:
+         print("Failed to auto update `Quality of Life Suit`")
+
+ # ----------------------------------------------------------
+
+ from .src.QualityOfLifeSuit_Omar92 import NODE_CLASS_MAPPINGS as NODE_CLASS_MAPPINGS_SUIT
+ try:
+     from .src.QualityOfLife_deprecatedNodes import NODE_CLASS_MAPPINGS as NODE_CLASS_MAPPINGS_DEPRECATED
+ except ImportError:
+     NODE_CLASS_MAPPINGS_DEPRECATED = {}
+
+ __all__ = ['NODE_CLASS_MAPPINGS_SUIT', 'NODE_CLASS_MAPPINGS_DEPRECATED', 'NODE_CLASS_MAPPINGS']
+ NODE_CLASS_MAPPINGS = {
+     **NODE_CLASS_MAPPINGS_SUIT,
+     **NODE_CLASS_MAPPINGS_DEPRECATED
+ }
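The mapping merge at the end of `__init__.py` relies on dict unpacking: later mappings win on key collisions. A minimal standalone sketch (the two sample node names are illustrative placeholders, not the real registry):

```python
# Placeholder mappings standing in for the suit's real node registries.
NODE_CLASS_MAPPINGS_SUIT = {"O_ChatGPT_O": object}
NODE_CLASS_MAPPINGS_DEPRECATED = {"ChatGPT_Omar92": object}

# **-unpacking merges both dicts; a duplicate key in the second dict
# would override the first one's entry.
NODE_CLASS_MAPPINGS = {
    **NODE_CLASS_MAPPINGS_SUIT,
    **NODE_CLASS_MAPPINGS_DEPRECATED,
}
print(sorted(NODE_CLASS_MAPPINGS))   # → ['ChatGPT_Omar92', 'O_ChatGPT_O']
```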
__pycache__/__init__.cpython-310.pyc ADDED
Binary file (1.2 kB).
 
config.json ADDED
@@ -0,0 +1,5 @@
+ {
+     "autoUpdate": true,
+     "branch": "main",
+     "openAI_API_Key": "sk-#########################################"
+ }
fonts/Alkatra.ttf ADDED
Binary file (858 kB).
 
fonts/CALIBRI.TTF ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:df2c69a18a462e5cbc97d04a033f3bd7cd0abfe818381641f8c2dee7b7c43dbd
+ size 1681220
fonts/COMIC.TTF ADDED
Binary file (246 kB).
 
fonts/COMICI.TTF ADDED
Binary file (227 kB).
 
fonts/COMICZ.TTF ADDED
Binary file (224 kB).
 
nsp_pantry.json ADDED
The diff for this file is too large to render. See raw diff
 
src/QualityOfLifeSuit_Omar92.py ADDED
@@ -0,0 +1,1271 @@
+ # Developed by Omar - https://github.com/omar92
+ # https://civitai.com/user/omar92
+ # discord: Omar92#3374
+
+ import io
+ import json
+ import os
+ import random
+ import time
+ from urllib.request import urlopen
+ import numpy as np
+ import requests
+ import torch
+ from PIL import Image, ImageFont, ImageDraw
+ import importlib
+ import comfy.samplers
+ import comfy.sd
+ import comfy.utils
+ import torch.nn as nn
+
+ MAX_RESOLUTION = 8192
+
+ # region INSTALLATION CLEANUP (thanks WAS, I got this from you)
+ # Archive and delete legacy nodes
+ legacy_nodes = ['ChatGPT_Omar92.py',
+                 'LatentUpscaleMultiply_Omar92.py', 'StringSuit_Omar92.py']
+ legacy_nodes_found = []
+ f_disp = False
+ for f in legacy_nodes:
+     node_path_dir = os.getcwd() + '/ComfyUI/custom_nodes/'
+     file = f'{node_path_dir}{f}'
+     if os.path.exists(file):
+         import zipfile
+         if not f_disp:
+             print(
+                 '\033[33mQualityOflife Node Suite:\033[0m Found legacy nodes. Archiving legacy nodes...')
+             f_disp = True
+         legacy_nodes_found.append(file)
+ if legacy_nodes_found:
+     from os.path import basename
+     archive = zipfile.ZipFile(
+         f'{node_path_dir}QualityOflife_Backup_{round(time.time())}.zip', "w")
+     for f in legacy_nodes_found:
+         archive.write(f, basename(f))
+         try:
+             os.remove(f)
+         except OSError:
+             pass
+     archive.close()
+ if f_disp:
+     print('\033[33mQualityOflife Node Suite:\033[0m Legacy cleanup complete.')
+ # endregion
+
+ # region global
+ PACKAGE_NAME = '\033[33mQualityOfLifeSuit_Omar92:\033[0m'
+ NODE_FILE = os.path.abspath(__file__)
+ SUIT_DIR = (os.path.dirname(os.path.dirname(NODE_FILE))
+             if os.path.dirname(os.path.dirname(NODE_FILE)) == 'QualityOfLifeSuit_Omar92'
+             or os.path.dirname(os.path.dirname(NODE_FILE)) == 'QualityOfLifeSuit_Omar92-dev'
+             else os.path.dirname(NODE_FILE))
+ SUIT_DIR = os.path.join(SUIT_DIR, "..")  # portable equivalent of SUIT_DIR + "\\.."
+ print(f'\033[33mQualityOfLifeSuit_Omar92_DIR:\033[0m {SUIT_DIR}')
+
+
+ def enforce_mul_of_64(d):
+     leftover = d % 8  # latent dimensions correspond to multiples of 8 pixels
+     if leftover != 0:  # if the dimension is not a multiple of 8
+         if leftover < 4:  # small remainder: round down
+             d -= leftover
+         else:  # large remainder: round up to the next multiple of 8
+             d += 8 - leftover
+     return d
+
+ # endregion
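`enforce_mul_of_64` rounds a latent dimension to the nearest multiple of 8: down when the remainder is below 4, up otherwise. Copied here standalone so the rounding behavior can be run directly:

```python
def enforce_mul_of_64(d):
    # Round d to the nearest multiple of 8, matching the suit's helper.
    leftover = d % 8
    if leftover != 0:
        if leftover < 4:
            d -= leftover      # small remainder: round down
        else:
            d += 8 - leftover  # large remainder: round up
    return d

print([enforce_mul_of_64(n) for n in (512, 514, 517, 520)])  # → [512, 512, 520, 520]
```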
+
+
+ # region openAITools
+
+
+ def install_openai():
+     # Helper function to install the OpenAI module if not already installed
+     try:
+         importlib.import_module('openai')
+     except ImportError:
+         import pip
+         pip.main(['install', 'openai'])
+
+
+ def get_api_key():
+     # Helper function to read the API key from the config file
+     try:
+         # open config file
+         configPath = os.path.join(SUIT_DIR, "config.json")
+         with open(configPath, 'r') as f:  # Open the file and read the API key
+             config = json.load(f)
+             api_key = config["openAI_API_Key"]
+     except Exception:
+         print("Error: OpenAI API key file not found; OpenAI features won't work for you")
+         return ""
+     return api_key  # Return the API key
+
+
+ openAI_models = None
+
+
+ def get_openAI_models():
+     global openAI_models
+     if openAI_models is not None:
+         return openAI_models
+
+     install_openai()
+     import openai
+     # Set the API key for the OpenAI module
+     openai.api_key = get_api_key()
+
+     try:
+         models = openai.Model.list()  # Get the list of models
+     except Exception:
+         print("Error: OpenAI API key is invalid; OpenAI features won't work for you")
+         return []
+
+     openAI_models = []  # Create a list for the model ids
+     for model in models["data"]:  # Loop through the models
+         openAI_models.append(model["id"])  # Add the model id to the list
+
+     return openAI_models  # Return the list of models
+
+
+ openAI_gpt_models = None
+
+
+ def get_gpt_models():
+     global openAI_gpt_models
+     if openAI_gpt_models is not None:
+         return openAI_gpt_models
+     models = get_openAI_models()
+     openAI_gpt_models = []  # Create a list for the chat models
+     for model in models:  # Loop through the models
+         if "gpt" in model.lower():
+             openAI_gpt_models.append(model)
+
+     return openAI_gpt_models  # Return the list of chat models
+
+
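`get_gpt_models` just filters the cached model list by a case-insensitive substring match. An equivalent standalone filter (the sample model ids are illustrative):

```python
def filter_gpt(models):
    # Keep only ids that contain "gpt", case-insensitively,
    # as get_gpt_models does over the cached OpenAI model list.
    return [m for m in models if "gpt" in m.lower()]

print(filter_gpt(["gpt-3.5-turbo", "text-davinci-003", "GPT-4"]))  # → ['gpt-3.5-turbo', 'GPT-4']
```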
+ class O_ChatGPT_O:
+     """
+     This node uses the OpenAI chat API to generate prompts from a small input
+     """
+     # Define the input types for the node
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+                 # Multiline string input for the prompt
+                 "prompt": ("STRING", {"multiline": True}),
+                 "model": (get_gpt_models(), {"default": "gpt-3.5-turbo"}),
+             },
+             "optional": {
+                 "seed": ("INT", {"default": 0, "min": 0, "max": 0xffffffffffffffff}),
+             }
+         }
+
+     RETURN_TYPES = ("STRING",)  # Define the return type of the node
+     FUNCTION = "fun"  # Define the function name for the node
+     CATEGORY = "O >>/OpenAI >>"  # Define the category for the node
+
+     def fun(self, model, prompt, seed):
+         install_openai()  # Install the OpenAI module if not already installed
+         import openai  # Import the OpenAI module
+
+         # Get the API key from the file
+         api_key = get_api_key()
+
+         openai.api_key = api_key  # Set the API key for the OpenAI module
+
+         # Create a chat completion using the OpenAI module
+         completion = openai.ChatCompletion.create(
+             model=model,
+             messages=[
+                 {"role": "user", "content": "act as a prompt generator; I will give you text and you will describe an image that matches that text in detail; answer with one response only"},
+                 {"role": "user", "content": prompt}
+             ]
+         )
+         # Get the answer from the chat completion
+         answer = completion["choices"][0]["message"]["content"]
+         return (answer,)  # Return the answer as a string
+ # region advanced
+
+
+ class load_openAI_O:
+     """
+     This node loads the openAI module
+     """
+     # Define the input types for the node
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+             }
+         }
+     RETURN_TYPES = ("OPENAI",)  # Define the return type of the node
+     FUNCTION = "fun"  # Define the function name for the node
+     CATEGORY = "O >>/OpenAI >>/Advanced >>"  # Define the category for the node
+
+     def fun(self):
+         install_openai()  # Install the OpenAI module if not already installed
+         import openai  # Import the OpenAI module
+
+         # Get the API key from the file
+         api_key = get_api_key()
+         openai.api_key = api_key  # Set the API key for the OpenAI module
+
+         return (
+             {
+                 "openai": openai,  # Return the openAI module
+             },
+         )
+ # region ChatGPT
+
+
+ class openAi_chat_message_O:
+     """
+     Create a chat message for openAI chatGPT
+     """
+     # Define the input types for the node
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+                 "role": (["user", "assistant", "system"], {"default": "user"}),
+                 "content": ("STRING", {"multiline": True, "default": "act as a prompt generator; I will give you text and you will describe an image that matches that text in detail; answer with one response only"}),
+             }
+         }
+     # Define the return type of the node
+     RETURN_TYPES = ("OPENAI_CHAT_MESSAGES",)
+     FUNCTION = "fun"  # Define the function name for the node
+     # Define the category for the node
+     CATEGORY = "O >>/OpenAI >>/Advanced >>/ChatGPT >>"
+
+     def fun(self, role, content):
+         return (
+             {
+                 "messages": [{"role": role, "content": content, }]
+             },
+         )
+
+
+ class openAi_chat_messages_Combine_O:
+     """
+     Combine two chat message lists into one
+     """
+     # Define the input types for the node
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+                 "message1": ("OPENAI_CHAT_MESSAGES", ),
+                 "message2": ("OPENAI_CHAT_MESSAGES", ),
+             }
+         }
+     # Define the return type of the node
+     RETURN_TYPES = ("OPENAI_CHAT_MESSAGES",)
+     FUNCTION = "fun"  # Define the function name for the node
+     # Define the category for the node
+     CATEGORY = "O >>/OpenAI >>/Advanced >>/ChatGPT >>"
+
+     def fun(self, message1, message2):
+         messages = message1["messages"] + \
+             message2["messages"]  # combine messages
+
+         return (
+             {
+                 "messages": messages
+             },
+         )
+
+
+ class openAi_chat_completion_O:
+     """
+     Create a chat completion for openAI chatGPT
+     """
+     # Define the input types for the node
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+                 "openai": ("OPENAI", ),
+                 # "model": ("STRING", {"multiline": False, "default": "gpt-3.5-turbo"}),
+                 "model": (get_gpt_models(), {"default": "gpt-3.5-turbo"}),
+                 "messages": ("OPENAI_CHAT_MESSAGES", ),
+             },
+             "optional": {
+                 "seed": ("INT", {"default": 0, "min": 0, "max": 0xffffffffffffffff}),
+             }
+         }
+     # Define the return type of the node
+     RETURN_TYPES = ("STRING", "OPENAI_CHAT_COMPLETION",)
+     FUNCTION = "fun"  # Define the function name for the node
+     OUTPUT_NODE = True
+     # Define the category for the node
+     CATEGORY = "O >>/OpenAI >>/Advanced >>/ChatGPT >>"
+
+     def fun(self, openai, model, messages, seed):
+         # Create a chat completion using the OpenAI module
+         openai = openai["openai"]
+         completion = openai.ChatCompletion.create(
+             model=model,
+             messages=messages["messages"]
+         )
+         # Get the answer from the chat completion
+         content = completion["choices"][0]["message"]["content"]
+         return (
+             content,  # Return the answer as a string
+             completion,  # Return the chat completion
+         )
+
+
+ class DebugOpenAIChatMEssages_O:
+     """
+     Debug OpenAI Chat Messages
+     """
+     # Define the input types for the node
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+                 "messages": ("OPENAI_CHAT_MESSAGES", ),
+             }
+         }
+     # Define the return type of the node
+     RETURN_TYPES = ()
+     FUNCTION = "fun"  # Define the function name for the node
+     OUTPUT_NODE = True
+     # Define the category for the node
+     CATEGORY = "O >>/OpenAI >>/Advanced >>/ChatGPT >>"
+
+     def fun(self, messages):
+         print(f'{PACKAGE_NAME}:OpenAIChatMEssages', messages["messages"])
+         return ()
+
+
+ class DebugOpenAIChatCompletion_O:
+     """
+     Debug OpenAI Chat Completion
+     """
+     # Define the input types for the node
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+                 "completion": ("OPENAI_CHAT_COMPLETION", ),
+             }
+         }
+     # Define the return type of the node
+     RETURN_TYPES = ()
+     FUNCTION = "fun"  # Define the function name for the node
+     OUTPUT_NODE = True
+     # Define the category for the node
+     CATEGORY = "O >>/OpenAI >>/Advanced >>/ChatGPT >>"
+
+     def fun(self, completion):
+         print(f'{PACKAGE_NAME}:OpenAIChatCompletion:', completion)
+         return ()
+ # endregion ChatGPT
+ # region Image
+
+
+ class openAi_Image_create_O:
+     """
+     Create an image using openAI
+     """
+     # Define the input types for the node
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+                 "openai": ("OPENAI", ),
+                 "prompt": ("STRING", {"multiline": True}),
+                 "number": ("INT", {"default": 1, "min": 1, "max": 10, "step": 1}),
+                 "size": (["256x256", "512x512", "1024x1024"], {"default": "256x256"}),
+             },
+             "optional": {
+                 "seed": ("INT", {"default": 0, "min": 0, "max": 0xffffffffffffffff}),
+             }
+         }
+     # Define the return type of the node
+     RETURN_TYPES = ("IMAGE", "MASK")
+     FUNCTION = "fun"  # Define the function name for the node
+     OUTPUT_NODE = True
+     # Define the category for the node
+     CATEGORY = "O >>/OpenAI >>/Advanced >>/Image >>"
+
+     def fun(self, openai, prompt, number, size, seed):
+         # Create an image using the OpenAI module
+         openai = openai["openai"]
+         number = 1  # the node currently requests a single image
+
+         imageURL = ""
+         try:
+             imagesURLS = openai.Image.create(
+                 prompt=prompt,
+                 n=number,
+                 size=size
+             )
+             imageURL = imagesURLS["data"][0]["url"]
+         except Exception as e:
+             print(f'{PACKAGE_NAME}:openAi_Image_create_O:', e)
+             imageURL = "https://i.imgur.com/removed.png"
+
+         image = requests.get(imageURL).content
+         i = Image.open(io.BytesIO(image))
+         image = i.convert("RGBA")
+         image = np.array(image).astype(np.float32) / 255.0
+         image = torch.from_numpy(image)[None,]
+         if 'A' in i.getbands():
+             mask = np.array(i.getchannel('A')).astype(np.float32) / 255.0
+             mask = 1. - torch.from_numpy(mask)
+         else:
+             mask = torch.zeros((64, 64), dtype=torch.float32, device="cpu")
+         return (image, mask)
+
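The image nodes derive their MASK output from the alpha channel: alpha is normalized to [0, 1] and then inverted, so fully opaque pixels become 0 and fully transparent pixels become 1. The core step, reduced to plain Python (the sample alpha values are illustrative):

```python
# Stand-in for the node's alpha-to-mask step: normalize to [0, 1],
# then invert so opaque (alpha=255) maps to 0 and transparent (alpha=0) to 1.
alpha = [255, 128, 0]
mask = [1.0 - a / 255.0 for a in alpha]
print(mask)
```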
+
+ class openAi_Image_Edit_O:
+     """
+     Edit an image using openAI
+     """
+     # Define the input types for the node
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+                 "openai": ("OPENAI", ),
+                 "image": ("IMAGE",),
+                 "prompt": ("STRING", {"multiline": True}),
+                 "number": ("INT", {"default": 1, "min": 1, "max": 10, "step": 1}),
+                 "size": (["256x256", "512x512", "1024x1024"], {"default": "256x256"}),
+             },
+             "optional": {
+                 "seed": ("INT", {"default": 0, "min": 0, "max": 0xffffffffffffffff}),
+             }
+         }
+     # Define the return type of the node
+     RETURN_TYPES = ("IMAGE", "MASK")
+     FUNCTION = "fun"  # Define the function name for the node
+     OUTPUT_NODE = True
+     # Define the category for the node
+     CATEGORY = "O >>/OpenAI >>/Advanced >>/Image >>"
+
+     def fun(self, openai, image, prompt, number, size, seed):
+         # Edit an image using the OpenAI module
+         openai = openai["openai"]
+         number = 1  # the node currently requests a single image
+
+         # Convert PyTorch tensor to a PIL image
+         image = image[0]
+         i = 255. * image.cpu().numpy()
+         img = Image.fromarray(np.clip(i, 0, 255).astype(np.uint8))
+         # Save the image to a BytesIO object as a PNG file
+         with io.BytesIO() as output:
+             img.save(output, format='PNG')
+             binary_image = output.getvalue()
+
+         # Create a circular mask with alpha 0 in the middle
+         mask = np.zeros((image.shape[0], image.shape[1], 4), dtype=np.uint8)
+         center = (image.shape[1] // 2, image.shape[0] // 2)
+         radius = min(center[0], center[1])
+         mask_img = Image.fromarray(mask, mode='RGBA')  # keep a reference so the drawing is not discarded
+         draw = ImageDraw.Draw(mask_img)
+         draw.ellipse((center[0]-radius, center[1]-radius, center[0]+radius,
+                       center[1]+radius), fill=(0, 0, 0, 255), outline=(0, 0, 0, 0))
+         del draw
+         # Save the mask to a BytesIO object as a PNG file
+         with io.BytesIO() as output:
+             mask_img.save(output, format='PNG')
+             binary_mask = output.getvalue()
+
+         imageURL = ""
+         try:
+             imagesURLS = openai.Image.create_edit(
+                 image=binary_image,
+                 mask=binary_mask,
+                 prompt=prompt,
+                 n=number,
+                 size=size
+             )
+             imageURL = imagesURLS["data"][0]["url"]
+         except Exception as e:
+             print(f'{PACKAGE_NAME}:openAi_Image_Edit_O:', e)
+             imageURL = "https://i.imgur.com/removed.png"
+
+         image = requests.get(imageURL).content
+         i = Image.open(io.BytesIO(image))
+         image = i.convert("RGBA")
+         image = np.array(image).astype(np.float32) / 255.0
+         image = torch.from_numpy(image)[None,]
+         if 'A' in i.getbands():
+             mask = np.array(i.getchannel('A')).astype(np.float32) / 255.0
+             mask = 1. - torch.from_numpy(mask)
+         else:
+             mask = torch.zeros(
+                 (1, image.shape[2], image.shape[3]), dtype=torch.float32, device="cpu")
+         return (image, mask)
+
+
+ class openAi_Image_variation_O:
+     """
+     Create a variation of an image using openAI
+     """
+     # Define the input types for the node
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+                 "openai": ("OPENAI", ),
+                 "image": ("IMAGE",),
+                 "number": ("INT", {"default": 1, "min": 1, "max": 10, "step": 1}),
+                 "size": (["256x256", "512x512", "1024x1024"], {"default": "256x256"}),
+             },
+             "optional": {
+                 "seed": ("INT", {"default": 0, "min": 0, "max": 0xffffffffffffffff}),
+             }
+         }
+     # Define the return type of the node
+     RETURN_TYPES = ("IMAGE", "MASK")
+     FUNCTION = "fun"  # Define the function name for the node
+     OUTPUT_NODE = True
+     # Define the category for the node
+     CATEGORY = "O >>/OpenAI >>/Advanced >>/Image >>"
+
+     def fun(self, openai, image, number, size, seed):
+         # Create an image variation using the OpenAI module
+         openai = openai["openai"]
+         number = 1  # the node currently requests a single image
+
+         # Convert PyTorch tensor to a PIL image
+         image = image[0]
+         i = 255. * image.cpu().numpy()
+         img = Image.fromarray(np.clip(i, 0, 255).astype(np.uint8))
+         # Save the image to a BytesIO object as a PNG file
+         with io.BytesIO() as output:
+             img.save(output, format='PNG')
+             binary_image = output.getvalue()
+
+         imageURL = ""
+         try:
+             imagesURLS = openai.Image.create_variation(
+                 image=binary_image,
+                 n=number,
+                 size=size
+             )
+             imageURL = imagesURLS["data"][0]["url"]
+         except Exception as e:
+             print(f'{PACKAGE_NAME}:openAi_Image_variation_O:', e)
+             imageURL = "https://i.imgur.com/removed.png"
+
+         image = requests.get(imageURL).content
+         i = Image.open(io.BytesIO(image))
+         image = i.convert("RGBA")
+         image = np.array(image).astype(np.float32) / 255.0
+         image = torch.from_numpy(image)[None,]
+         if 'A' in i.getbands():
+             mask = np.array(i.getchannel('A')).astype(np.float32) / 255.0
+             mask = 1. - torch.from_numpy(mask)
+         else:
+             mask = torch.zeros(
+                 (1, image.shape[2], image.shape[3]), dtype=torch.float32, device="cpu")
+         return (image, mask)
+ # endregion Image
+ # endregion advanced
+ # endregion openAI
+
+ # region latentTools
+
+
+ class LatentUpscaleFactor_O:
+     """
+     Upscale the latent code by multiplying the width and height by a factor
+     """
+     upscale_methods = ["nearest-exact", "bilinear", "area"]
+     crop_methods = ["disabled", "center"]
+
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+                 "samples": ("LATENT",),
+                 "upscale_method": (cls.upscale_methods,),
+                 "WidthFactor": ("FLOAT", {"default": 1.25, "min": 0.0, "max": 10.0, "step": 0.28125}),
+                 "HeightFactor": ("FLOAT", {"default": 1.25, "min": 0.0, "max": 10.0, "step": 0.28125}),
+                 "crop": (cls.crop_methods,),
+             }
+         }
+
+     RETURN_TYPES = ("LATENT",)
+     FUNCTION = "upscale"
+     CATEGORY = "O >>/latent >>"
+
+     def upscale(self, samples, upscale_method, WidthFactor, HeightFactor, crop):
+         s = samples.copy()
+         x = samples["samples"].shape[3]
+         y = samples["samples"].shape[2]
+
+         new_x = int(x * WidthFactor)
+         new_y = int(y * HeightFactor)
+
+         if new_x > MAX_RESOLUTION:
+             new_x = MAX_RESOLUTION
+         if new_y > MAX_RESOLUTION:
+             new_y = MAX_RESOLUTION
+
+         print(f'{PACKAGE_NAME}:upscale from ({x*8},{y*8}) to ({new_x*8},{new_y*8})')
+
+         s["samples"] = comfy.utils.common_upscale(
+             samples["samples"], enforce_mul_of_64(
+                 new_x), enforce_mul_of_64(new_y), upscale_method, crop
+         )
+         return (s,)
+
+
+ class SelectLatentImage_O:
+     """
+     Select a single image from a batch of generated latent images.
+     """
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+                 "samples": ("LATENT",),
+                 "index": ("INT", {"default": 0, "min": 0}),
+             }
+         }
+
+     RETURN_TYPES = ("LATENT",)
+     FUNCTION = "fun"
+     CATEGORY = "O >>/latent >>"
+
+     def fun(self, samples, index):
+         # Get the batch size and number of channels
+         batch_size, num_channels, height, width = samples["samples"].shape
+
+         # Ensure that the index is within bounds
+         if index >= batch_size:
+             index = batch_size - 1
+
+         # Select the specified image
+         selected_image = samples["samples"][index].unsqueeze(0)
+
+         # Return the selected image
+         return ({"samples": selected_image},)
+
657
+
658
+ class VAEDecodeParallel_O:
659
+ def __init__(self, device="cpu"):
660
+ self.device = device
661
+ self.device_count = torch.cuda.device_count() if device != "cpu" else 1
662
+ self.module = VAEDecodeOriginal(device)
663
+ self.net = nn.DataParallel(self.module)
664
+
665
+ @classmethod
666
+ def INPUT_TYPES(cls):
667
+ return {"required": {"samples": ("LATENT", ), "vae": ("VAE", )}}
668
+
669
+ RETURN_TYPES = ("IMAGE",)
670
+ FUNCTION = "decode_parallel"
671
+ CATEGORY = "latent"
672
+
673
+ def decode_parallel(self, vae, samples):
674
+ batch_size = samples["samples"].shape[0]
675
+ images = torch.zeros((batch_size, 3, 256, 256)).to(self.device)
676
+
677
+ for i in range(0, batch_size, self.device_count):
678
+ batch_samples = samples["samples"][i:i +
679
+ self.device_count].to(self.device)
680
+ batch_images = self.net(vae, {"samples": batch_samples})[
681
+ 0].to(self.device)
682
+ images[i:i+self.device_count] = batch_images
683
+
684
+ return (images,)
685
+
686
+
687
+ class VAEDecodeOriginal:
688
+ def __init__(self, device="cpu"):
689
+ self.device = device
690
+
691
+ @classmethod
692
+ def INPUT_TYPES(s):
693
+ return {"required": {"samples": ("LATENT", ), "vae": ("VAE", )}}
694
+ RETURN_TYPES = ("IMAGE",)
695
+ FUNCTION = "decode"
696
+
697
+ CATEGORY = "latent"
698
+
699
+ def decode(self, vae, samples):
700
+ return (vae.decode(samples["samples"]), )
701
+
702
+ # endregion latentTools
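The index handling in `SelectLatentImage_O` can be exercised on its own. A minimal stdlib sketch (`clamp_index` is a hypothetical helper that mirrors the node's bounds check, using a plain list in place of a latent tensor):

```python
def clamp_index(index, batch_size):
    # Mirror SelectLatentImage_O: out-of-range indices fall back to the last image.
    if index >= batch_size:
        index = batch_size - 1
    return index

batch = ["latent0", "latent1", "latent2"]
selected = batch[clamp_index(7, len(batch))]  # requesting index 7 yields the last entry
```

Negative indices are not clamped here because the node's `index` input already declares `"min": 0`.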
703
+
704
+ # region TextTools
705
+
706
+
707
+ class seed2String_O:
708
+ """
709
+ This node convert seeds to string // can be used to force the system to read a string again if it got compined with it
710
+ """
711
+ @classmethod
712
+ def INPUT_TYPES(cls):
713
+ return {"required": {"seed": ("SEED")}}
714
+
715
+ RETURN_TYPES = ("STRING")
716
+ FUNCTION = "fun"
717
+ CATEGORY = "O >>/utils >>"
718
+
719
+ def fun(self, seed):
720
+ return (str(seed))
721
+
722
+
723
+ class DebugText_O:
724
+ """
725
+ This node will write a text to the console
726
+ """
727
+ @classmethod
728
+ def INPUT_TYPES(cls):
729
+ return {"required": {
730
+ "text": ("STRING", {"multiline": True}),
731
+ "prefix": ("STRING", {"default": "debug", "multiline": False}),
732
+ }}
733
+
734
+ RETURN_TYPES = ()
735
+ FUNCTION = "debug_string"
736
+ OUTPUT_NODE = True
737
+ CATEGORY = "O >>/text >>"
738
+
739
+ @staticmethod
740
+ def debug_string(text, prefix):
741
+ print(f'{PACKAGE_NAME}:{prefix}:{text}')
742
+ return ()
743
+
744
+
745
+ fonts = None
746
+
747
+
748
+ def loadFonts():
749
+
750
+ global fonts
751
+ if (fonts != None):
752
+ return fonts
753
+ try:
754
+ fonts_filepath = os.path.join(SUIT_DIR, "fonts")
755
+ fonts = []
756
+ for file in os.listdir(fonts_filepath):
757
+ if file.endswith(".ttf") or file.endswith(".otf") or file.endswith(".ttc") or file.endswith(".TTF") or file.endswith(".OTF") or file.endswith(".TTC"):
758
+ fonts.append(file)
759
+ except:
760
+ fonts = []
761
+
762
+ if (len(fonts) == 0):
763
+ print(f'{PACKAGE_NAME}:no fonts found in {fonts_filepath}')
764
+ fonts = ["Arial.ttf"]
765
+ return fonts
766
+
767
+
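The extension check in `loadFonts` can be collapsed into one case-insensitive test; a small standalone sketch (`is_font_file` is a hypothetical helper, not part of the package):

```python
def is_font_file(name):
    # Same filter as loadFonts, with a single case-insensitive check
    # instead of six chained endswith calls.
    return name.lower().endswith((".ttf", ".otf", ".ttc"))

files = ["CALIBRI.TTF", "readme.md", "NotoSans.otf", "icons.ttc"]
fonts = [f for f in files if is_font_file(f)]  # keeps only the font files
```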
+ class Text2Image_O:
+     """
+     This node converts a string to an image.
+     """
+
+     def __init__(self):
+         self.font_filepath = os.path.join(SUIT_DIR, "fonts")
+
+     @classmethod
+     def INPUT_TYPES(s):
+         return {
+             "required": {
+                 "text": ("STRING", {"multiline": True}),
+                 "font": (loadFonts(), {"default": loadFonts()[0], }),
+                 "size": ("INT", {"default": 36, "min": 0, "max": 255, "step": 1}),
+                 "font_R": ("INT", {"default": 0, "min": 0, "max": 255, "step": 1}),
+                 "font_G": ("INT", {"default": 0, "min": 0, "max": 255, "step": 1}),
+                 "font_B": ("INT", {"default": 0, "min": 0, "max": 255, "step": 1}),
+                 "font_A": ("INT", {"default": 255, "min": 0, "max": 255, "step": 1}),
+                 "background_R": ("INT", {"default": 255, "min": 0, "max": 255, "step": 1}),
+                 "background_G": ("INT", {"default": 255, "min": 0, "max": 255, "step": 1}),
+                 "background_B": ("INT", {"default": 255, "min": 0, "max": 255, "step": 1}),
+                 "background_A": ("INT", {"default": 255, "min": 0, "max": 255, "step": 1}),
+                 "width": ("INT", {"default": 128, "min": 0, "step": 1}),
+                 "height": ("INT", {"default": 128, "min": 0, "step": 1}),
+                 "expand": (["true", "false"], {"default": "true"}),
+                 "x": ("INT", {"default": 0, "min": -100, "step": 1}),
+                 "y": ("INT", {"default": 0, "min": -100, "step": 1}),
+             }
+         }
+
+     RETURN_TYPES = ("IMAGE",)
+     FUNCTION = "create_image_new"
+     OUTPUT_NODE = False
+     CATEGORY = "O >>/text >>"
+
+     def create_image_new(self, text, font, size, font_R, font_G, font_B, font_A, background_R, background_G, background_B, background_A, width, height, expand, x, y):
+         font_color = (font_R, font_G, font_B, font_A)
+         background_color = (background_R, background_G,
+                             background_B, background_A)
+
+         font_path = os.path.join(self.font_filepath, font)
+         font = ImageFont.truetype(font_path, size)
+
+         # Initialize a throwaway drawing context to measure the text
+         image = Image.new('RGBA', (1, 1), color=background_color)
+         draw = ImageDraw.Draw(image)
+
+         # Get the size of the text
+         text_width, text_height = draw.textsize(text, font=font)
+
+         # Expand the image to fit the text if requested
+         if expand == "true":
+             if width < text_width:
+                 width = text_width
+             if height < text_height:
+                 height = text_height
+
+         width = enforce_mul_of_64(width)
+         height = enforce_mul_of_64(height)
+
+         # Create the final image
+         image = Image.new('RGBA', (width, height), color=background_color)
+
+         # Initialize the drawing context
+         draw = ImageDraw.Draw(image)
+
+         # Calculate the position of the text, clamped to the image bounds
+         text_x = x - text_width / 2
+         if text_x < 0:
+             text_x = 0
+         if text_x > width - text_width:
+             text_x = width - text_width
+
+         text_y = y - text_height / 2
+         if text_y < 0:
+             text_y = 0
+         if text_y > height - text_height:
+             text_y = height - text_height
+
+         # Draw the text on the image
+         draw.text((text_x, text_y), text, fill=font_color, font=font)
+
+         # Convert the PIL Image to a tensor
+         image_np = np.array(image).astype(np.float32) / 255.0
+         image_tensor = torch.from_numpy(image_np).unsqueeze(0)
+         return (image_tensor,)
+ # region text/NSP
+
+
+ nspterminology = None  # Cache the NSP terminology
+
+
+ def loadNSP():
+     global nspterminology
+     if nspterminology is not None:
+         return nspterminology
+     # Fetch the NSP pantry
+     local_pantry = os.getcwd() + '/ComfyUI/custom_nodes/nsp_pantry.json'
+     if not os.path.exists(local_pantry):
+         print(f'{PACKAGE_NAME}:downloading NSP')
+         response = urlopen(
+             'https://raw.githubusercontent.com/WASasquatch/noodle-soup-prompts/main/nsp_pantry.json')
+         tmp_pantry = json.loads(response.read())
+         # Dump the JSON locally
+         pantry_serialized = json.dumps(tmp_pantry, indent=4)
+         with open(local_pantry, "w") as f:
+             f.write(pantry_serialized)
+         del response, tmp_pantry
+
+     # Load the local pantry
+     with open(local_pantry, 'r') as f:
+         nspterminology = json.load(f)
+
+     print(f'{PACKAGE_NAME}:NSP ready')
+     return nspterminology
+
+
+ class RandomNSP_O:
+     @classmethod
+     def loadCategories(s):
+         nspterminology = loadNSP()
+         return list(nspterminology)
+
+     @classmethod
+     def INPUT_TYPES(s):
+         return {"required": {
+             "terminology": (s.loadCategories(),),
+             "seed": ("INT", {"default": 0, "min": 0, "max": 0xffffffffffffffff}),
+         }}
+
+     RETURN_TYPES = ("STRING",)
+     FUNCTION = "fun"
+     CATEGORY = "O >>/text >>/NSP >>"
+
+     def fun(self, terminology, seed):
+         nspterminology = loadNSP()
+         # Seed the random generator so picks are reproducible
+         random.seed(seed)
+
+         result = random.choice(nspterminology[terminology])
+         return (result,)
+
+ # endregion text/NSP
+
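The seeded draw in `RandomNSP_O` is deterministic: seeding the global RNG before `random.choice` makes the same seed return the same term. A minimal stdlib sketch (`pick_term` is a hypothetical helper standing in for the node's `fun`):

```python
import random

def pick_term(terms, seed):
    # RandomNSP_O seeds the global RNG, then draws one entry from the category.
    random.seed(seed)
    return random.choice(terms)

terms = ["alpha", "beta", "gamma"]
first = pick_term(terms, 42)
again = pick_term(terms, 42)  # same seed, same pick
```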
+ # region text/operations
+
+
+ class concat_text_O:
+     """
+     This node concatenates two strings together.
+     """
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {"required": {
+             "text1": ("STRING", {"multiline": True}),
+             "separator": ("STRING", {"multiline": False, "default": ","}),
+             "text2": ("STRING", {"multiline": True})
+         }}
+
+     RETURN_TYPES = ("STRING",)
+     FUNCTION = "fun"
+     CATEGORY = "O >>/text >>/operations >>"
+
+     @staticmethod
+     def fun(text1, separator, text2):
+         return (text1 + separator + text2,)
+
+
+ class trim_text_O:
+     """
+     This node trims whitespace from both ends of a string.
+     """
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {"required": {
+             "text": ("STRING", {"multiline": True}),
+         }}
+
+     RETURN_TYPES = ("STRING",)
+     FUNCTION = "fun"
+     CATEGORY = "O >>/text >>/operations >>"
+
+     def fun(self, text):
+         return (text.strip(),)
+
+
+ class replace_text_O:
+     """
+     This node replaces every occurrence of a substring with another string.
+     """
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {"required": {
+             "text": ("STRING", {"multiline": True}),
+             "old": ("STRING", {"multiline": False}),
+             "new": ("STRING", {"multiline": False})
+         }}
+
+     RETURN_TYPES = ("STRING",)
+     FUNCTION = "fun"
+     CATEGORY = "O >>/text >>/operations >>"
+
+     @staticmethod
+     def fun(text, old, new):
+         return (text.replace(old, new),)
+ # endregion
+ # endregion TextTools
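The three text-operation nodes above are thin wrappers over built-in `str` methods, so a typical chain can be sketched with plain functions (hypothetical stand-ins for the node `fun` methods):

```python
def concat(text1, separator, text2):
    return text1 + separator + text2

def trim(text):
    return text.strip()

def replace(text, old, new):
    return text.replace(old, new)

# A chain like Concat Text -> Trim Text -> Replace Text:
prompt = concat("a photo of a cat", ", ", "high quality ")
prompt = trim(prompt)                 # drop the trailing space
prompt = replace(prompt, "cat", "dog")
```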
+
+ # region Image
+
+
+ class ImageScaleFactor_O:
+     upscale_methods = ["nearest-exact", "bilinear", "area"]
+     crop_methods = ["disabled", "center"]
+     toggle = ["enabled", "disabled"]
+
+     @classmethod
+     def INPUT_TYPES(s):
+         return {"required": {"image": ("IMAGE",),
+                              "upscale_method": (s.upscale_methods,),
+                              "WidthFactor": ("FLOAT", {"default": 1.25, "min": 0.0, "max": 10.0, "step": 0.28125}),
+                              "HeightFactor": ("FLOAT", {"default": 1.25, "min": 0.0, "max": 10.0, "step": 0.28125}),
+                              "MulOf64": (s.toggle, {"default": "enabled"}),
+                              "crop": (s.crop_methods,)
+                              }}
+     RETURN_TYPES = ("IMAGE",)
+     FUNCTION = "upscale"
+     CATEGORY = "O >>/image >>"
+
+     def upscale(self, image, upscale_method, WidthFactor, HeightFactor, crop, MulOf64):
+         # After movedim the tensor is [batch, channel, height, width]
+         samples = image.movedim(-1, 1)
+         width = WidthFactor * samples.shape[3]
+         height = HeightFactor * samples.shape[2]
+         if width > MAX_RESOLUTION:
+             width = MAX_RESOLUTION
+         if height > MAX_RESOLUTION:
+             height = MAX_RESOLUTION
+
+         if MulOf64 == "enabled":
+             width = enforce_mul_of_64(width)
+             height = enforce_mul_of_64(height)
+
+         width = int(width)
+         height = int(height)
+         print(
+             f'{PACKAGE_NAME}:upscale from ({samples.shape[3]},{samples.shape[2]}) to ({width},{height})')
+         s = comfy.utils.common_upscale(
+             samples, width, height, upscale_method, crop)
+         s = s.movedim(1, -1)
+         return (s,)
+ # endregion
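`enforce_mul_of_64` is defined elsewhere in the package; a plausible implementation (an assumption, shown only to illustrate why upscaled dimensions stay aligned to the 64-pixel granularity the samplers expect) rounds up to the next multiple of 64:

```python
def enforce_mul_of_64(value):
    # Assumed behavior: round up to the nearest multiple of 64.
    # The real helper lives elsewhere in the package and may differ.
    value = int(value)
    rem = value % 64
    return value if rem == 0 else value + (64 - rem)

sizes = [enforce_mul_of_64(v) for v in (512, 513, 700)]
```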
+
+ # region numbers
+
+
+ def solveEquation(equation):
+     answer = 0.0
+     try:
+         # Evaluate the expression using Python's built-in eval
+         answer = eval(equation)
+     except Exception as e:
+         print(f'{PACKAGE_NAME}: equation is not valid: {equation} error: {e}')
+         answer = "NAN"
+
+     return answer
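The equation nodes substitute the inputs into the expression string and hand it to this `eval`-based helper; a standalone sketch of that behavior (`solve` is a hypothetical stand-in, without the package's logging):

```python
def solve(equation):
    # Same approach as solveEquation: eval the substituted expression,
    # fall back to the string "NAN" on any error.
    try:
        return eval(equation)
    except Exception:
        return "NAN"

ok = solve("(2.0)+1")   # "x+1" after x = 2.0 is substituted
bad = solve("(2.0)+")   # malformed expression
```

Note that `eval` runs arbitrary Python, so equation strings should only come from trusted workflow inputs.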
+
+
+ class applyEquation1param_O:
+     """
+     This node applies an equation with one variable (x) to a float.
+     """
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {"required": {
+             "x": ("FLOAT", {"default": 0.0, "min": 0.0, "max": 0xffffffffffffffff}),
+             "equation": ("STRING", {"multiline": True, "default": "x+1"}),
+         }
+         }
+
+     RETURN_TYPES = ("FLOAT",)
+     FUNCTION = "fun"
+     CATEGORY = "O >>/numbers >>"
+
+     def fun(self, x, equation):
+         equation = equation.replace("x", "(" + str(x) + ")")
+         answer = solveEquation(equation)
+         return (answer,)
+
+
+ class applyEquation2params_O:
+     """
+     This node applies an equation with two variables (x and y) to two floats.
+     """
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {"required": {
+             "x": ("FLOAT", {"default": 0.0, "min": 0.0, "max": 0xffffffffffffffff}),
+             "y": ("FLOAT", {"default": 0.0, "min": 0.0, "max": 0xffffffffffffffff}),
+             "equation": ("STRING", {"multiline": True, "default": "x+y"}),
+         }}
+
+     RETURN_TYPES = ("FLOAT",)
+     FUNCTION = "fun"
+     CATEGORY = "O >>/numbers >>"
+
+     def fun(self, x, y, equation):
+         equation = equation.replace("x", "(" + str(x) + ")")
+         equation = equation.replace("y", "(" + str(y) + ")")
+         answer = solveEquation(equation)
+         return (answer,)
+
+
+ class floatToInt_O:
+     """
+     This node converts a float to an int (truncating toward zero).
+     """
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {"required": {
+             "float": ("FLOAT", {"default": 0.0, "min": 0.0, "max": 0xffffffffffffffff}),
+         }
+         }
+
+     RETURN_TYPES = ("INT",)
+     FUNCTION = "fun"
+     CATEGORY = "O >>/numbers >>"
+
+     def fun(self, float):
+         return (int(float),)
+
+
+ class intToFloat_O:
+     """
+     This node converts an int to a float.
+     """
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {"required": {
+             "int": ("INT", {"default": 0, "min": 0, "max": 0xffffffffffffffff}),
+         }
+         }
+
+     RETURN_TYPES = ("FLOAT",)
+     FUNCTION = "fun"
+     CATEGORY = "O >>/numbers >>"
+
+     def fun(self, int):
+         return (float(int),)
+
+
+ class floatToText_O:
+     """
+     This node converts a float to text.
+     """
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {"required": {
+             "float": ("FLOAT", {"default": 0.0, "min": 0.0, "max": 0xffffffffffffffff}),
+         }
+         }
+
+     RETURN_TYPES = ("STRING",)
+     FUNCTION = "fun"
+     CATEGORY = "O >>/numbers >>"
+
+     def fun(self, float):
+         return (str(float),)
+ # endregion
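A detail worth knowing about `floatToInt_O`: Python's `int()` truncates toward zero rather than rounding, so `3.9` becomes `3`. A tiny sketch (`float_to_int` is a hypothetical stand-in for the node's `fun`):

```python
def float_to_int(x):
    # int() truncates toward zero; it does not round to nearest.
    return int(x)

a = float_to_int(3.9)
b = float_to_int(3.1)
```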
+
+ # region Utils
+
+
+ class Text_O:
+     """
+     This node provides a text value to the graph.
+     """
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+                 "text": ("STRING", {"multiline": True}),
+             }
+         }
+
+     RETURN_TYPES = ("STRING",)
+     FUNCTION = "fun"
+     CATEGORY = "O >>/utils >>"
+
+     def fun(self, text):
+         return (text + " ",)
+
+
+ class seed_O:
+     """
+     This node provides a seed value to the graph.
+     """
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {"required": {"seed": ("INT", {"default": 0, "min": 0, "max": 0xffffffffffffffff}), }}
+
+     RETURN_TYPES = ("INT",)
+     FUNCTION = "fun"
+     CATEGORY = "O >>/utils >>"
+
+     def fun(self, seed):
+         return (seed,)
+
+
+ class int_O:
+     """
+     This node provides an int value to the graph.
+     """
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {"required": {"int": ("INT", {"default": 0, "min": 0, "max": 0xffffffffffffffff}), }}
+
+     RETURN_TYPES = ("INT",)
+     FUNCTION = "fun"
+     CATEGORY = "O >>/utils >>"
+
+     def fun(self, int):
+         return (int,)
+
+
+ class float_O:
+     """
+     This node provides a float value to the graph.
+     """
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {"required": {"float": ("FLOAT", {"default": 0.0, "min": 0.0, "max": 0xffffffffffffffff}), }}
+
+     RETURN_TYPES = ("FLOAT",)
+     FUNCTION = "fun"
+     CATEGORY = "O >>/utils >>"
+
+     def fun(self, float):
+         return (float,)
+
+
+ class Note_O:
+     @classmethod
+     def INPUT_TYPES(s):
+         return {"required": {"text": ("STRING", {"multiline": True})}}
+
+     RETURN_TYPES = ()
+     FUNCTION = "fun"
+     OUTPUT_NODE = True
+     CATEGORY = "O >>/utils >>"
+
+     def fun(self, text):
+         return ()
+ # endregion
+
+
+ # Define the node class mappings
+ NODE_CLASS_MAPPINGS = {
+     # openAITools------------------------------------------
+     "ChatGPT Simple _O": O_ChatGPT_O,
+     # openAiTools > Advanced
+     "load_openAI _O": load_openAI_O,
+     # openAiTools > Advanced > ChatGPT
+     "Chat_Message _O": openAi_chat_message_O,
+     "combine_chat_messages _O": openAi_chat_messages_Combine_O,
+     "Chat completion _O": openAi_chat_completion_O,
+     "debug messages_O": DebugOpenAIChatMEssages_O,
+     "debug Completeion _O": DebugOpenAIChatCompletion_O,
+     # openAiTools > Advanced > image
+     "create image _O": openAi_Image_create_O,
+     # "Edit_image _O": openAi_Image_Edit,  # coming soon
+     "variation_image _O": openAi_Image_variation_O,
+     # latentTools------------------------------------------
+     "LatentUpscaleFactor _O": LatentUpscaleFactor_O,
+     "selectLatentFromBatch _O": SelectLatentImage_O,
+     # "VAEDecodeParallel _O": VAEDecodeParallel_O,  # coming soon
+     # StringTools------------------------------------------
+     "Debug Text _O": DebugText_O,
+     "RandomNSP _O": RandomNSP_O,
+     "Concat Text _O": concat_text_O,
+     "Trim Text _O": trim_text_O,
+     "Replace Text _O": replace_text_O,
+     "Text2Image _O": Text2Image_O,
+     # ImageTools------------------------------------------
+     "ImageScaleFactor _O": ImageScaleFactor_O,
+     # NumberTools------------------------------------------
+     "Equation1param _O": applyEquation1param_O,
+     "Equation2params _O": applyEquation2params_O,
+     "floatToInt _O": floatToInt_O,
+     "intToFloat _O": intToFloat_O,
+     "floatToText _O": floatToText_O,
+     # Utils------------------------------------------
+     "Note _O": Note_O,
+     "Text _O": Text_O,
+     "seed _O": seed_O,
+     "int _O": int_O,
+     "float _O": float_O,
+ }
src/QualityOfLife_deprecatedNodes.py ADDED
@@ -0,0 +1,543 @@
+ # Developed by Omar - https://github.com/omar92
+ # https://civitai.com/user/omar92
+ # discord: Omar92#3374
+
+ ###
+ #
+ # All nodes in this file are deprecated and will be removed in the future.
+ # They are kept here for backward compatibility.
+ #
+ ###
+
+ import io
+ import os
+ import time
+ import numpy as np
+ import requests
+ import torch
+ from PIL import Image, ImageFont, ImageDraw
+ import importlib
+ import comfy.samplers
+ import comfy.sd
+ import comfy.utils
+
+ # -------------------------------------------------------------------------------------------
+ # region deprecated
+ # region openAITools
+
+
+ class O_ChatGPT_deprecated:
+     """
+     This node uses the OpenAI chat API to generate prompts.
+     """
+     # Define the input types for the node
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+                 # Multiline string input for the prompt
+                 "prompt": ("STRING", {"multiline": True}),
+                 # File input for the API key
+                 "api_key_file": ("STRING", {"file": True, "default": "api_key.txt"})
+             }
+         }
+
+     RETURN_TYPES = ("STR",)  # Define the return type of the node
+     FUNCTION = "fun"  # Define the function name for the node
+     CATEGORY = "O >>/deprecated >>/OpenAI >>"  # Define the category for the node
+
+     def fun(self, api_key_file, prompt):
+         self.install_openai()  # Install the OpenAI module if not already installed
+         import openai  # Import the OpenAI module
+
+         # Get the API key from the file
+         api_key = self.get_api_key(api_key_file)
+
+         openai.api_key = api_key  # Set the API key for the OpenAI module
+
+         # Create a chat completion using the OpenAI module
+         completion = openai.ChatCompletion.create(
+             model="gpt-3.5-turbo",
+             messages=[
+                 {"role": "user", "content": "act as prompt generator ,i will give you text and you describe an image that match that text in details, answer with one response only"},
+                 {"role": "user", "content": prompt}
+             ]
+         )
+         # Get the answer from the chat completion
+         answer = completion["choices"][0]["message"]["content"]
+
+         return (
+             {
+                 "string": answer,  # Return the answer as a string
+             },
+         )
+
+     # Helper function to get the API key from the file
+     def get_api_key(self, api_key_file):
+         custom_nodes_dir = 'ComfyUI/custom_nodes/'  # Define the directory for the file
+         with open(custom_nodes_dir + api_key_file, 'r') as f:  # Open the file and read the API key
+             api_key = f.read().strip()
+         return api_key  # Return the API key
+
+     # Helper function to install the OpenAI module if not already installed
+     def install_openai(self):
+         try:
+             importlib.import_module('openai')
+         except ImportError:
+             import pip
+             pip.main(['install', 'openai'])
+ # region advanced
+
+
+ class load_openAI_deprecated:
+     """
+     This node loads the OpenAI module.
+     """
+     # Define the input types for the node
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+                 # File input for the API key
+                 "api_key_file": ("STRING", {"file": True, "default": "api_key.txt"})
+             }
+         }
+
+     RETURN_TYPES = ("OPENAI",)  # Define the return type of the node
+     FUNCTION = "fun"  # Define the function name for the node
+     CATEGORY = "O >>/OpenAI >>/Advanced >>"  # Define the category for the node
+
+     def fun(self, api_key_file):
+         self.install_openai()  # Install the OpenAI module if not already installed
+         import openai  # Import the OpenAI module
+
+         # Get the API key from the file
+         api_key = self.get_api_key(api_key_file)
+         openai.api_key = api_key  # Set the API key for the OpenAI module
+
+         return (
+             {
+                 "openai": openai,  # Return the OpenAI module
+             },
+         )
+
+     # Helper function to install the OpenAI module if not already installed
+     def install_openai(self):
+         try:
+             importlib.import_module('openai')
+         except ImportError:
+             import pip
+             pip.main(['install', 'openai'])
+
+     # Helper function to get the API key from the file
+     def get_api_key(self, api_key_file):
+         custom_nodes_dir = 'ComfyUI/custom_nodes/'  # Define the directory for the file
+         with open(custom_nodes_dir + api_key_file, 'r') as f:  # Open the file and read the API key
+             api_key = f.read().strip()
+         return api_key  # Return the API key
+ # region ChatGPT
+
+
+ class openAi_chat_message_STR_deprecated:
+     """
+     Create a chat message for OpenAI's ChatGPT.
+     """
+     # Define the input types for the node
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+                 "role": (["user", "assistant", "system"], {"default": "user"}),
+                 "content": ("STR",),
+             }
+         }
+     # Define the return type of the node
+     RETURN_TYPES = ("OPENAI_CHAT_MESSAGES",)
+     FUNCTION = "fun"  # Define the function name for the node
+     # Define the category for the node
+     CATEGORY = "O >>/deprecated >>/OpenAI >>/Advanced >>/ChatGPT"
+
+     def fun(self, role, content):
+         return (
+             {
+                 "messages": [{"role": role, "content": content["string"], }]
+             },
+         )
+ # endregion ChatGPT
+ # region Image
+
+
+ class openAi_chat_messages_Combine_deprecated:
+     """
+     Combine chat messages into one tuple.
+     """
+     # Define the input types for the node
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+                 "message1": ("OPENAI_CHAT_MESSAGES", ),
+                 "message2": ("OPENAI_CHAT_MESSAGES", ),
+             }
+         }
+     # Define the return type of the node
+     RETURN_TYPES = ("OPENAI_CHAT_MESSAGES",)
+     FUNCTION = "fun"  # Define the function name for the node
+     # Define the category for the node
+     CATEGORY = "O >>/deprecated >>/OpenAI >>/Advanced >>/ChatGPT >>"
+
+     def fun(self, message1, message2):
+         messages = message1["messages"] + message2["messages"]  # combine messages
+
+         return (
+             {
+                 "messages": messages
+             },
+         )
+
+
+ class openAi_Image_create_deprecated:
+     """
+     Create an image using OpenAI.
+     """
+     # Define the input types for the node
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+                 "openai": ("OPENAI", ),
+                 "prompt": ("STR",),
+                 "number": ("INT", {"default": 1, "min": 1, "max": 10, "step": 1}),
+                 "size": (["256x256", "512x512", "1024x1024"], {"default": "256x256"}),
+             }
+         }
+     # Define the return type of the node
+     RETURN_TYPES = ("IMAGE", "MASK")
+     FUNCTION = "fun"  # Define the function name for the node
+     OUTPUT_NODE = True
+     # Define the category for the node
+     CATEGORY = "O >>/deprecated >>/OpenAI >>/Advanced >>/Image"
+
+     def fun(self, openai, prompt, number, size):
+         # Create an image using the OpenAI module
+         openai = openai["openai"]
+         prompt = prompt["string"]
+         number = 1
+         imagesURLS = openai.Image.create(
+             prompt=prompt,
+             n=number,
+             size=size
+         )
+         imageURL = imagesURLS["data"][0]["url"]
+         print("imageURL:", imageURL)
+         image = requests.get(imageURL).content
+         i = Image.open(io.BytesIO(image))
+         image = i.convert("RGBA")
+         image = np.array(image).astype(np.float32) / 255.0
+         # image_np = np.transpose(image_np, (2, 0, 1))
+         image = torch.from_numpy(image)[None,]
+         if 'A' in i.getbands():
+             mask = np.array(i.getchannel('A')).astype(np.float32) / 255.0
+             mask = 1. - torch.from_numpy(mask)
+         else:
+             mask = torch.zeros((64, 64), dtype=torch.float32, device="cpu")
+         print("image_tensor: done")
+         return (image, mask)
+
+
+ class openAi_chat_completion_deprecated:
+     """
+     Create a chat completion with OpenAI's ChatGPT.
+     """
+     # Define the input types for the node
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {
+             "required": {
+                 "openai": ("OPENAI", ),
+                 "model": ("STRING", {"multiline": False, "default": "gpt-3.5-turbo"}),
+                 "messages": ("OPENAI_CHAT_MESSAGES", ),
+             }
+         }
+     # Define the return type of the node
+     RETURN_TYPES = ("STR", "OPENAI_CHAT_COMPLETION",)
+     FUNCTION = "fun"  # Define the function name for the node
+     OUTPUT_NODE = True
+     # Define the category for the node
+     CATEGORY = "O >>/deprecated >>/OpenAI >>/Advanced >>/ChatGPT"
+
+     def fun(self, openai, model, messages):
+         # Create a chat completion using the OpenAI module
+         openai = openai["openai"]
+         completion = openai.ChatCompletion.create(
+             model=model,
+             messages=messages["messages"]
+         )
+         # Get the answer from the chat completion
+         content = completion["choices"][0]["message"]["content"]
+         return (
+             {
+                 "string": content,  # Return the answer as a string
+             },
+             {
+                 "completion": completion,  # Return the chat completion
+             }
+         )
+
+
+ # endregion Image
+ # endregion advanced
+ # endregion openAI
+ # region StringTools
+
+
+ class O_String_deprecated:
+     """
+     A simple string node that holds user input as a string.
+     """
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {"required": {"string": ("STRING", {"multiline": True})}}
+
+     RETURN_TYPES = ("STR",)
+     FUNCTION = "ostr"
+     CATEGORY = "O >>/deprecated >>/string >>"
+
+     @staticmethod
+     def ostr(string):
+         return ({"string": string},)
+
+
+ class DebugString_deprecated:
+     """
+     This node writes a string to the console.
+     """
+     @classmethod
+     def INPUT_TYPES(cls):
+         return {"required": {"string": ("STR",)}}
+
+     RETURN_TYPES = ()
+     FUNCTION = "debug_string"
+     OUTPUT_NODE = True
+     CATEGORY = "O >>/deprecated >>/string >>"
+
+     @staticmethod
+     def debug_string(string):
+         print("debugString:", string["string"])
+         return ()
+
+
+ class string2Image_deprecated:
+     """
+     This node converts a string to an image.
+     """
+
+     def __init__(self):
+         self.font_filepath = os.path.join(
+             os.path.dirname(os.path.realpath(__file__)), "fonts")
+
+     @classmethod
+     def INPUT_TYPES(s):
+         return {
+             "required": {
+                 "string": ("STR",),
+                 "font": ("STRING", {"default": "CALIBRI.TTF", "multiline": False}),
+                 "size": ("INT", {"default": 36, "min": 0, "max": 255, "step": 1}),
+                 "font_R": ("INT", {"default": 0, "min": 0, "max": 255, "step": 1}),
+                 "font_G": ("INT", {"default": 0, "min": 0, "max": 255, "step": 1}),
+                 "font_B": ("INT", {"default": 0, "min": 0, "max": 255, "step": 1}),
+                 "background_R": ("INT", {"default": 255, "min": 0, "max": 255, "step": 1}),
+                 "background_G": ("INT", {"default": 255, "min": 0, "max": 255, "step": 1}),
+                 "background_B": ("INT", {"default": 255, "min": 0, "max": 255, "step": 1}),
+             }
+         }
+
+     RETURN_TYPES = ("IMAGE",)
+     FUNCTION = "create_image"
+     OUTPUT_NODE = False
+     CATEGORY = "O >>/deprecated >>/string >>"
+
+     def create_image(self, string, font, size, font_R, font_G, font_B, background_R, background_G, background_B):
+         font_color = (font_R, font_G, font_B)
+         # os.path.join keeps the path portable instead of hard-coding a backslash
+         font = ImageFont.truetype(os.path.join(self.font_filepath, font), size)
+         mask_image = font.getmask(string["string"], "L")
+         image = Image.new("RGBA", mask_image.size,
+                           (background_R, background_G, background_B))
+         # need to use the inner `img.im.paste` due to `getmask` returning a core
+         image.im.paste(font_color, (0, 0) + mask_image.size, mask_image)
+
+         # Convert the PIL Image to a tensor
+         image_np = np.array(image).astype(np.float32) / 255.0
+         image_tensor = torch.from_numpy(image_np).unsqueeze(0)
+         return (image_tensor,)
+ class CLIPStringEncode_deprecated:
371
+ """
372
+ This node will encode a string with CLIP
373
+ """
374
+ @classmethod
375
+ def INPUT_TYPES(s):
376
+ return {"required": {
377
+ "string": ("STR",),
378
+ "clip": ("CLIP", )
379
+ }}
380
+ RETURN_TYPES = ("CONDITIONING",)
381
+ FUNCTION = "encode"
382
+
383
+ CATEGORY = "O >>/deprecated >>/string >>"
384
+
385
+ def encode(self, string, clip):
386
+ return ([[clip.encode(string["string"]), {}]], )
387
+ # region String/operations
388
+
389
+
390
+ class concat_String_deprecated:
391
+ """
392
+ This node will concatenate two strings together
393
+ """
394
+ @classmethod
395
+ def INPUT_TYPES(cls):
396
+ return {"required": {
397
+ "string1": ("STR",),
398
+ "string2": ("STR",)
399
+ }}
400
+
401
+ RETURN_TYPES = ("STR",)
402
+ FUNCTION = "fun"
403
+ CATEGORY = "O >>/deprecated >>/string >>/operations >>"
404
+
405
+ @staticmethod
406
+ def fun(string1, string2):
407
+ return ({"string": string1["string"] + string2["string"]},)
408
+
409
+
410
+ class trim_String_deprecated:
411
+ """
412
+ This node will trim a string from the left and right
413
+ """
414
+ @classmethod
415
+ def INPUT_TYPES(cls):
416
+ return {"required": {
417
+ "string": ("STR",),
418
+ }}
419
+
420
+ RETURN_TYPES = ("STR",)
421
+ FUNCTION = "fun"
422
+ CATEGORY = "O >>/deprecated >>/string >>/operations >>"
423
+
424
+ def fun(self, string):
425
+ return (
426
+ {
427
+ "string": (string["string"].strip()),
428
+ },
429
+ )
430
+
431
+
432
+ class replace_String_deprecated:
433
+ """
434
+ This node will replace a string with another string
435
+ """
436
+ @classmethod
437
+ def INPUT_TYPES(cls):
438
+ return {"required": {
439
+ "string": ("STR",),
440
+ "old": ("STRING", {"multiline": False}),
441
+ "new": ("STRING", {"multiline": False})
442
+ }}
443
+
444
+ RETURN_TYPES = ("STR",)
445
+ FUNCTION = "fun"
446
+ CATEGORY = "O >>/deprecated >>/string >>/operations >>"
447
+
448
+ @staticmethod
449
+ def fun(string, old, new):
450
+ return ({"string": string["string"].replace(old, new)},)
451
+
452
+ # replace a string with another string
453
+
454
+
455
+ class replace_String_advanced_deprecated:
456
+ """
457
+ This node will replace a string with another string
458
+ """
459
+ @classmethod
460
+ def INPUT_TYPES(cls):
461
+ return {"required": {
462
+ "string": ("STR",),
463
+ "old": ("STR",),
464
+ "new": ("STR",),
465
+ }}
466
+
467
+ RETURN_TYPES = ("STR",)
468
+ FUNCTION = "fun"
469
+ CATEGORY = "O >>/deprecated >>/string >>/operations >>"
470
+
471
+ @staticmethod
472
+ def fun(string, old, new):
473
+ return ({"string": string["string"].replace(old["string"], new["string"])},)
474
+ # endregion
475
+ # endregion
476
+
477
+
478
+ class LatentUpscaleMultiply_deprecated:
479
+ """
480
+ Upscale the latent code by multiplying the width and height by a factor
481
+ """
482
+ upscale_methods = ["nearest-exact", "bilinear", "area"]
483
+ crop_methods = ["disabled", "center"]
484
+
485
+ @classmethod
486
+ def INPUT_TYPES(cls):
487
+ return {
488
+ "required": {
489
+ "samples": ("LATENT",),
490
+ "upscale_method": (cls.upscale_methods,),
491
+ "WidthMul": ("FLOAT", {"default": 1.25, "min": 0.0, "max": 10.0, "step": 0.1}),
492
+ "HeightMul": ("FLOAT", {"default": 1.25, "min": 0.0, "max": 10.0, "step": 0.1}),
493
+ "crop": (cls.crop_methods,),
494
+ }
495
+ }
496
+
497
+ RETURN_TYPES = ("LATENT",)
498
+ FUNCTION = "upscale"
499
+ CATEGORY = "O >>/deprecated >>/latent >>"
500
+
501
+ def upscale(self, samples, upscale_method, WidthMul, HeightMul, crop):
502
+ s = samples.copy()
503
+ x = samples["samples"].shape[3]
504
+ y = samples["samples"].shape[2]
505
+
506
+ new_x = int(x * WidthMul)
507
+ new_y = int(y * HeightMul)
508
+ print(f"upscale from ({x*8},{y*8}) to ({new_x*8},{new_y*8})")
509
+
510
+ def enforce_mul_of_64(d):
511
+ leftover = d % 8
512
+ if leftover != 0:
513
+ d += 8 - leftover
514
+ return d
515
+
516
+ s["samples"] = comfy.utils.common_upscale(
517
+ samples["samples"], enforce_mul_of_64(
518
+ new_x), enforce_mul_of_64(new_y), upscale_method, crop
519
+ )
520
+ return (s,)
521
+
522
+ # endregion deprecated
523
+
524
+
525
+ # Define the node class mappings
526
+ NODE_CLASS_MAPPINGS = {
527
+ # deprecated
528
+ "ChatGPT _O": O_ChatGPT_deprecated,
529
+ "Chat_Message_fromString _O": openAi_chat_message_STR_deprecated,
530
+ "compine_chat_messages _O": openAi_chat_messages_Combine_deprecated,
531
+ "Chat_Completion _O": openAi_chat_completion_deprecated,
532
+ "create_image _O": openAi_Image_create_deprecated,
533
+ "String _O": O_String_deprecated,
534
+ "Debug String _O": DebugString_deprecated,
535
+ "concat Strings _O": concat_String_deprecated,
536
+ "trim String _O": trim_String_deprecated,
537
+ "replace String _O": replace_String_deprecated,
538
+ "replace String advanced _O": replace_String_advanced_deprecated,
539
+ "string2Image _O": string2Image_deprecated,
540
+ "CLIPStringEncode _O": CLIPStringEncode_deprecated,
541
+ "CLIPStringEncode _O": CLIPStringEncode_deprecated,
542
+ "LatentUpscaleMultiply": LatentUpscaleMultiply_deprecated,
543
+ }
src/__pycache__/QualityOfLifeSuit_Omar92.cpython-310.pyc ADDED
Binary file (29.1 kB). View file
 
src/__pycache__/QualityOfLife_deprecatedNodes.cpython-310.pyc ADDED
Binary file (13.5 kB). View file
 
update/__pycache__/update.cpython-310.pyc ADDED
Binary file (2.27 kB). View file
 
update/update.py ADDED
@@ -0,0 +1,77 @@
+import importlib
+from datetime import datetime
+
+
+def pull(pygit2, repo, remote_name='origin', branch='master'):
+    for remote in repo.remotes:
+        # print("fetching latest changes:", remote.name)
+        if remote.name == remote_name:
+            remote.fetch()
+            remote_master_id = repo.lookup_reference('refs/remotes/origin/%s' % (branch)).target
+            merge_result, _ = repo.merge_analysis(remote_master_id)
+            # up to date, do nothing
+            if merge_result & pygit2.GIT_MERGE_ANALYSIS_UP_TO_DATE:
+                return
+            # we can just fast-forward
+            elif merge_result & pygit2.GIT_MERGE_ANALYSIS_FASTFORWARD:
+                repo.checkout_tree(repo.get(remote_master_id))
+                try:
+                    master_ref = repo.lookup_reference('refs/heads/%s' % (branch))
+                    master_ref.set_target(remote_master_id)
+                except KeyError:
+                    repo.create_branch(branch, repo.get(remote_master_id))
+                repo.head.set_target(remote_master_id)
+            elif merge_result & pygit2.GIT_MERGE_ANALYSIS_NORMAL:
+                repo.merge(remote_master_id)
+
+                if repo.index.conflicts is not None:
+                    for conflict in repo.index.conflicts:
+                        print('Conflicts found in:', conflict[0].path)
+                    raise AssertionError('Merge conflicts found, aborting')
+
+                user = repo.default_signature
+                tree = repo.index.write_tree()
+                commit = repo.create_commit('HEAD',
+                                            user,
+                                            user,
+                                            'Merge!',
+                                            tree,
+                                            [repo.head.target, remote_master_id])
+                # we need to do this or the git CLI will think we are still merging
+                repo.state_cleanup()
+            else:
+                raise AssertionError('Unknown merge analysis result')
+
+
+def install_pygit2():
+    # helper to install the pygit2 module if it is not already installed
+    try:
+        importlib.import_module('pygit2')
+    except ImportError:
+        import pip
+        pip.main(['install', 'pygit2'])
+
+
+def update(repoPath="", branch_name="main"):
+    print("Updating: Quality of Life Suit...")
+
+    install_pygit2()
+    import pygit2
+
+    repo = pygit2.Repository(repoPath)
+    ident = pygit2.Signature('omar92', 'omar@92')
+    try:
+        # print("stashing current changes")
+        repo.stash(ident)
+    except KeyError:
+        # nothing to stash
+        pass
+    backup_branch_name = 'backup_branch_{}'.format(datetime.today().strftime('%Y-%m-%d_%H_%M_%S'))
+    # print("creating backup branch: {}".format(backup_branch_name))
+    repo.branches.local.create(backup_branch_name, repo.head.peel())
+
+    # print(f"checking out {branch_name} branch")
+    branch = repo.lookup_branch(str(branch_name))
+    ref = repo.lookup_reference(branch.name)
+    repo.checkout(ref)
+
+    # print("pulling latest changes")
+    pull(pygit2, repo, branch=branch_name)
+
+    print("done: Quality of Life Suit updated successfully")
+
update/update_QualityOfLifeSuit.bat ADDED
@@ -0,0 +1,2 @@
+ python.exe .\update.py ..
+ pause
v2.1 Workflow _ load to check whats new.json ADDED
@@ -0,0 +1,3257 @@
1
+ {
2
+ "last_node_id": 84,
3
+ "last_link_id": 78,
4
+ "nodes": [
5
+ {
6
+ "id": 2,
7
+ "type": "Equation1param _O",
8
+ "pos": [
9
+ 70.26363461036739,
10
+ 125.46773995871277
11
+ ],
12
+ "size": {
13
+ "0": 400,
14
+ "1": 200
15
+ },
16
+ "flags": {},
17
+ "order": 0,
18
+ "mode": 0,
19
+ "outputs": [
20
+ {
21
+ "name": "FLOAT",
22
+ "type": "FLOAT",
23
+ "links": [
24
+ 7
25
+ ],
26
+ "slot_index": 0
27
+ }
28
+ ],
29
+ "properties": {
30
+ "Node name for S&R": "Equation1param _O"
31
+ },
32
+ "widgets_values": [
33
+ 6.5,
34
+ "x+1/3"
35
+ ]
36
+ },
37
+ {
38
+ "id": 5,
39
+ "type": "Equation2params _O",
40
+ "pos": [
41
+ 81.2636346103674,
42
+ 394.4677399587126
43
+ ],
44
+ "size": {
45
+ "0": 400,
46
+ "1": 200
47
+ },
48
+ "flags": {},
49
+ "order": 35,
50
+ "mode": 0,
51
+ "inputs": [
52
+ {
53
+ "name": "x",
54
+ "type": "FLOAT",
55
+ "link": 7,
56
+ "widget": {
57
+ "name": "x",
58
+ "config": [
59
+ "FLOAT",
60
+ {
61
+ "default": 0,
62
+ "min": 0,
63
+ "max": 18446744073709552000
64
+ }
65
+ ]
66
+ }
67
+ }
68
+ ],
69
+ "outputs": [
70
+ {
71
+ "name": "FLOAT",
72
+ "type": "FLOAT",
73
+ "links": [
74
+ 4
75
+ ],
76
+ "slot_index": 0
77
+ }
78
+ ],
79
+ "properties": {
80
+ "Node name for S&R": "Equation2params _O"
81
+ },
82
+ "widgets_values": [
83
+ 5,
84
+ 2.5,
85
+ "x+y"
86
+ ]
87
+ },
88
+ {
89
+ "id": 6,
90
+ "type": "floatToInt _O",
91
+ "pos": [
92
+ 551.2636346103676,
93
+ 134.4677399587128
94
+ ],
95
+ "size": {
96
+ "0": 315,
97
+ "1": 58
98
+ },
99
+ "flags": {},
100
+ "order": 43,
101
+ "mode": 0,
102
+ "inputs": [
103
+ {
104
+ "name": "float",
105
+ "type": "FLOAT",
106
+ "link": 4,
107
+ "widget": {
108
+ "name": "float",
109
+ "config": [
110
+ "FLOAT",
111
+ {
112
+ "default": 0,
113
+ "min": 0,
114
+ "max": 18446744073709552000
115
+ }
116
+ ]
117
+ }
118
+ }
119
+ ],
120
+ "outputs": [
121
+ {
122
+ "name": "INT",
123
+ "type": "INT",
124
+ "links": [
125
+ 5
126
+ ],
127
+ "slot_index": 0
128
+ }
129
+ ],
130
+ "properties": {
131
+ "Node name for S&R": "floatToInt _O"
132
+ },
133
+ "widgets_values": [
134
+ 0
135
+ ]
136
+ },
137
+ {
138
+ "id": 7,
139
+ "type": "intToFloat _O",
140
+ "pos": [
141
+ 571.2636346103676,
142
+ 322.4677399587126
143
+ ],
144
+ "size": {
145
+ "0": 315,
146
+ "1": 58
147
+ },
148
+ "flags": {},
149
+ "order": 49,
150
+ "mode": 0,
151
+ "inputs": [
152
+ {
153
+ "name": "int",
154
+ "type": "INT",
155
+ "link": 5,
156
+ "widget": {
157
+ "name": "int",
158
+ "config": [
159
+ "INT",
160
+ {
161
+ "default": 0,
162
+ "min": 0,
163
+ "max": 18446744073709552000
164
+ }
165
+ ]
166
+ }
167
+ }
168
+ ],
169
+ "outputs": [
170
+ {
171
+ "name": "FLOAT",
172
+ "type": "FLOAT",
173
+ "links": [
174
+ 6
175
+ ],
176
+ "slot_index": 0
177
+ }
178
+ ],
179
+ "properties": {
180
+ "Node name for S&R": "intToFloat _O"
181
+ },
182
+ "widgets_values": [
183
+ 0
184
+ ]
185
+ },
186
+ {
187
+ "id": 3,
188
+ "type": "floatToText _O",
189
+ "pos": [
190
+ 558.2636346103676,
191
+ 478.4677399587126
192
+ ],
193
+ "size": {
194
+ "0": 315,
195
+ "1": 58
196
+ },
197
+ "flags": {},
198
+ "order": 57,
199
+ "mode": 0,
200
+ "inputs": [
201
+ {
202
+ "name": "float",
203
+ "type": "FLOAT",
204
+ "link": 6,
205
+ "widget": {
206
+ "name": "float",
207
+ "config": [
208
+ "FLOAT",
209
+ {
210
+ "default": 0,
211
+ "min": 0,
212
+ "max": 18446744073709552000
213
+ }
214
+ ]
215
+ }
216
+ }
217
+ ],
218
+ "outputs": [
219
+ {
220
+ "name": "STRING",
221
+ "type": "STRING",
222
+ "links": [
223
+ 2
224
+ ],
225
+ "slot_index": 0
226
+ }
227
+ ],
228
+ "properties": {
229
+ "Node name for S&R": "floatToText _O"
230
+ },
231
+ "widgets_values": [
232
+ 0
233
+ ]
234
+ },
235
+ {
236
+ "id": 4,
237
+ "type": "Debug Text _O",
238
+ "pos": [
239
+ 945.2636346103674,
240
+ 353.4677399587126
241
+ ],
242
+ "size": {
243
+ "0": 400,
244
+ "1": 200
245
+ },
246
+ "flags": {
247
+ "collapsed": true
248
+ },
249
+ "order": 66,
250
+ "mode": 0,
251
+ "inputs": [
252
+ {
253
+ "name": "text",
254
+ "type": "STRING",
255
+ "link": 2,
256
+ "widget": {
257
+ "name": "text",
258
+ "config": [
259
+ "STRING",
260
+ {
261
+ "multiline": true
262
+ }
263
+ ]
264
+ }
265
+ }
266
+ ],
267
+ "properties": {
268
+ "Node name for S&R": "Debug Text _O"
269
+ },
270
+ "widgets_values": [
271
+ "",
272
+ "debug"
273
+ ]
274
+ },
275
+ {
276
+ "id": 12,
277
+ "type": "Debug Text _O",
278
+ "pos": [
279
+ 1053.5936369564054,
280
+ 815.5227515553921
281
+ ],
282
+ "size": {
283
+ "0": 400,
284
+ "1": 200
285
+ },
286
+ "flags": {
287
+ "collapsed": true
288
+ },
289
+ "order": 38,
290
+ "mode": 0,
291
+ "inputs": [
292
+ {
293
+ "name": "text",
294
+ "type": "STRING",
295
+ "link": 8,
296
+ "widget": {
297
+ "name": "text",
298
+ "config": [
299
+ "STRING",
300
+ {
301
+ "multiline": true
302
+ }
303
+ ]
304
+ }
305
+ }
306
+ ],
307
+ "properties": {
308
+ "Node name for S&R": "Debug Text _O"
309
+ },
310
+ "widgets_values": [
311
+ "",
312
+ "debug"
313
+ ]
314
+ },
315
+ {
316
+ "id": 14,
317
+ "type": "Chat_Message _O",
318
+ "pos": [
319
+ 486.5936369564066,
320
+ 1281.5227515553931
321
+ ],
322
+ "size": {
323
+ "0": 389.8748779296875,
324
+ "1": 128.49383544921875
325
+ },
326
+ "flags": {},
327
+ "order": 1,
328
+ "mode": 0,
329
+ "outputs": [
330
+ {
331
+ "name": "OPENAI_CHAT_MESSAGES",
332
+ "type": "OPENAI_CHAT_MESSAGES",
333
+ "links": [
334
+ 9
335
+ ],
336
+ "slot_index": 0
337
+ }
338
+ ],
339
+ "title": "Chat_Message _O (the init message)",
340
+ "properties": {
341
+ "Node name for S&R": "Chat_Message _O"
342
+ },
343
+ "widgets_values": [
344
+ "user",
345
+ "act as prompt generator ,i will give you text and you describe an image that match that text in details, answer with one response only"
346
+ ]
347
+ },
348
+ {
349
+ "id": 16,
350
+ "type": "combine_chat_messages _O",
351
+ "pos": [
352
+ 965.5936369564066,
353
+ 1282.5227515553931
354
+ ],
355
+ "size": {
356
+ "0": 367.79998779296875,
357
+ "1": 46
358
+ },
359
+ "flags": {},
360
+ "order": 37,
361
+ "mode": 0,
362
+ "inputs": [
363
+ {
364
+ "name": "message1",
365
+ "type": "OPENAI_CHAT_MESSAGES",
366
+ "link": 9
367
+ },
368
+ {
369
+ "name": "message2",
370
+ "type": "OPENAI_CHAT_MESSAGES",
371
+ "link": 10
372
+ }
373
+ ],
374
+ "outputs": [
375
+ {
376
+ "name": "OPENAI_CHAT_MESSAGES",
377
+ "type": "OPENAI_CHAT_MESSAGES",
378
+ "links": [
379
+ 12
380
+ ],
381
+ "slot_index": 0
382
+ }
383
+ ],
384
+ "properties": {
385
+ "Node name for S&R": "combine_chat_messages _O"
386
+ }
387
+ },
388
+ {
389
+ "id": 19,
390
+ "type": "Reroute",
391
+ "pos": [
392
+ 1311.5936369564051,
393
+ 1351.5227515553931
394
+ ],
395
+ "size": [
396
+ 75,
397
+ 26
398
+ ],
399
+ "flags": {},
400
+ "order": 45,
401
+ "mode": 0,
402
+ "inputs": [
403
+ {
404
+ "name": "",
405
+ "type": "*",
406
+ "link": 12,
407
+ "pos": [
408
+ 37.5,
409
+ 0
410
+ ]
411
+ }
412
+ ],
413
+ "outputs": [
414
+ {
415
+ "name": "",
416
+ "type": "OPENAI_CHAT_MESSAGES",
417
+ "links": [
418
+ 14
419
+ ],
420
+ "slot_index": 0
421
+ }
422
+ ],
423
+ "properties": {
424
+ "showOutputText": false,
425
+ "horizontal": true
426
+ }
427
+ },
428
+ {
429
+ "id": 11,
430
+ "type": "Note _O",
431
+ "pos": [
432
+ 25.593636956406257,
433
+ 1237.5227515553931
434
+ ],
435
+ "size": {
436
+ "0": 400,
437
+ "1": 200
438
+ },
439
+ "flags": {},
440
+ "order": 2,
441
+ "mode": 0,
442
+ "properties": {
443
+ "Node name for S&R": "Note _O"
444
+ },
445
+ "widgets_values": [
446
+ "updates\n - no longer need to use String nodes, so this new one is using \n normal text parameter, so it is now compatible with other comfyUI \n nodes that receive text\n\n - support selecting the model in chat Completion node\n so if you have access to gpt-4 you can use it\n\n//note: using ChatGPT with revAnimated or mistoonAnime checkpoints produce stunning accurate results"
447
+ ],
448
+ "color": "#432",
449
+ "bgcolor": "#653"
450
+ },
451
+ {
452
+ "id": 22,
453
+ "type": "Note _O",
454
+ "pos": [
455
+ -443.40636304359373,
456
+ 731.5227515553921
457
+ ],
458
+ "size": {
459
+ "0": 400,
460
+ "1": 200
461
+ },
462
+ "flags": {},
463
+ "order": 3,
464
+ "mode": 0,
465
+ "properties": {
466
+ "Node name for S&R": "Note _O"
467
+ },
468
+ "widgets_values": [
469
+ "Open AI package"
470
+ ],
471
+ "color": "#432",
472
+ "bgcolor": "#653"
473
+ },
474
+ {
475
+ "id": 13,
476
+ "type": "load_openAI _O",
477
+ "pos": [
478
+ 462.59363695640627,
479
+ 1093.5227515553931
480
+ ],
481
+ "size": {
482
+ "0": 315,
483
+ "1": 58
484
+ },
485
+ "flags": {},
486
+ "order": 4,
487
+ "mode": 0,
488
+ "outputs": [
489
+ {
490
+ "name": "OPENAI",
491
+ "type": "OPENAI",
492
+ "links": [
493
+ 18
494
+ ],
495
+ "slot_index": 0
496
+ }
497
+ ],
498
+ "properties": {
499
+ "Node name for S&R": "load_openAI _O"
500
+ },
501
+ "widgets_values": [
502
+ "api_key.txt"
503
+ ]
504
+ },
505
+ {
506
+ "id": 20,
507
+ "type": "Reroute",
508
+ "pos": [
509
+ 922.5936369564066,
510
+ 1404.5227515553931
511
+ ],
512
+ "size": [
513
+ 75,
514
+ 26
515
+ ],
516
+ "flags": {},
517
+ "order": 52,
518
+ "mode": 0,
519
+ "inputs": [
520
+ {
521
+ "name": "",
522
+ "type": "*",
523
+ "link": 14,
524
+ "pos": [
525
+ 37.5,
526
+ 0
527
+ ]
528
+ }
529
+ ],
530
+ "outputs": [
531
+ {
532
+ "name": "",
533
+ "type": "OPENAI_CHAT_MESSAGES",
534
+ "links": [
535
+ 15
536
+ ],
537
+ "slot_index": 0
538
+ }
539
+ ],
540
+ "properties": {
541
+ "showOutputText": false,
542
+ "horizontal": true
543
+ }
544
+ },
545
+ {
546
+ "id": 15,
547
+ "type": "Chat_Message _O",
548
+ "pos": [
549
+ 490.5936369564066,
550
+ 1450.5227515553931
551
+ ],
552
+ "size": {
553
+ "0": 389.8748779296875,
554
+ "1": 128.49383544921875
555
+ },
556
+ "flags": {},
557
+ "order": 5,
558
+ "mode": 0,
559
+ "outputs": [
560
+ {
561
+ "name": "OPENAI_CHAT_MESSAGES",
562
+ "type": "OPENAI_CHAT_MESSAGES",
563
+ "links": [
564
+ 10
565
+ ],
566
+ "slot_index": 0
567
+ }
568
+ ],
569
+ "title": "Chat_Message _O (your prompt)",
570
+ "properties": {
571
+ "Node name for S&R": "Chat_Message _O"
572
+ },
573
+ "widgets_values": [
574
+ "user",
575
+ "dancng girl"
576
+ ]
577
+ },
578
+ {
579
+ "id": 24,
580
+ "type": "Reroute",
581
+ "pos": [
582
+ 854.5936369564066,
583
+ 1097.5227515553931
584
+ ],
585
+ "size": [
586
+ 90.4,
587
+ 26
588
+ ],
589
+ "flags": {},
590
+ "order": 36,
591
+ "mode": 0,
592
+ "inputs": [
593
+ {
594
+ "name": "",
595
+ "type": "*",
596
+ "link": 18,
597
+ "pos": [
598
+ 45.2,
599
+ 0
600
+ ]
601
+ }
602
+ ],
603
+ "outputs": [
604
+ {
605
+ "name": "OPENAI",
606
+ "type": "OPENAI",
607
+ "links": [
608
+ 19,
609
+ 23
610
+ ],
611
+ "slot_index": 0
612
+ }
613
+ ],
614
+ "properties": {
615
+ "showOutputText": true,
616
+ "horizontal": true
617
+ }
618
+ },
619
+ {
620
+ "id": 27,
621
+ "type": "Reroute",
622
+ "pos": [
623
+ 438.59363695640627,
624
+ 1719.5227515553931
625
+ ],
626
+ "size": [
627
+ 90.4,
628
+ 26
629
+ ],
630
+ "flags": {},
631
+ "order": 50,
632
+ "mode": 0,
633
+ "inputs": [
634
+ {
635
+ "name": "",
636
+ "type": "*",
637
+ "link": 24,
638
+ "pos": [
639
+ 45.2,
640
+ 0
641
+ ]
642
+ }
643
+ ],
644
+ "outputs": [
645
+ {
646
+ "name": "OPENAI",
647
+ "type": "OPENAI",
648
+ "links": [
649
+ 22
650
+ ],
651
+ "slot_index": 0
652
+ }
653
+ ],
654
+ "properties": {
655
+ "showOutputText": true,
656
+ "horizontal": true
657
+ }
658
+ },
659
+ {
660
+ "id": 29,
661
+ "type": "Note _O",
662
+ "pos": [
663
+ 29.593636956406243,
664
+ 1675.5227515553931
665
+ ],
666
+ "size": {
667
+ "0": 400,
668
+ "1": 200
669
+ },
670
+ "flags": {},
671
+ "order": 6,
672
+ "mode": 0,
673
+ "properties": {
674
+ "Node name for S&R": "Note _O"
675
+ },
676
+ "widgets_values": [
677
+ "updates\n - create image input now is text instead of string, so it can be\n Compatible with any other text nodes \n - also add fake seeds to force the node to generate new input each \n cycle if needed"
678
+ ],
679
+ "color": "#432",
680
+ "bgcolor": "#653"
681
+ },
682
+ {
683
+ "id": 28,
684
+ "type": "Reroute",
685
+ "pos": [
686
+ 863.5936369564066,
687
+ 1635.5227515553931
688
+ ],
689
+ "size": [
690
+ 90.4,
691
+ 26
692
+ ],
693
+ "flags": {},
694
+ "order": 44,
695
+ "mode": 0,
696
+ "inputs": [
697
+ {
698
+ "name": "",
699
+ "type": "*",
700
+ "link": 23,
701
+ "pos": [
702
+ 45.2,
703
+ 0
704
+ ]
705
+ }
706
+ ],
707
+ "outputs": [
708
+ {
709
+ "name": "OPENAI",
710
+ "type": "OPENAI",
711
+ "links": [
712
+ 24,
713
+ 28
714
+ ],
715
+ "slot_index": 0
716
+ }
717
+ ],
718
+ "properties": {
719
+ "showOutputText": true,
720
+ "horizontal": true
721
+ }
722
+ },
723
+ {
724
+ "id": 30,
725
+ "type": "PreviewImage",
726
+ "pos": [
727
+ 1056.5936369564051,
728
+ 1782.5227515553931
729
+ ],
730
+ "size": {
731
+ "0": 237.004638671875,
732
+ "1": 215.01806640625
733
+ },
734
+ "flags": {},
735
+ "order": 67,
736
+ "mode": 0,
737
+ "inputs": [
738
+ {
739
+ "name": "images",
740
+ "type": "IMAGE",
741
+ "link": 25
742
+ }
743
+ ],
744
+ "properties": {
745
+ "Node name for S&R": "PreviewImage"
746
+ }
747
+ },
748
+ {
749
+ "id": 33,
750
+ "type": "PreviewImage",
751
+ "pos": [
752
+ 1765.5936369564051,
753
+ 1778.5227515553931
754
+ ],
755
+ "size": {
756
+ "0": 237.004638671875,
757
+ "1": 215.01806640625
758
+ },
759
+ "flags": {},
760
+ "order": 73,
761
+ "mode": 0,
762
+ "inputs": [
763
+ {
764
+ "name": "images",
765
+ "type": "IMAGE",
766
+ "link": 30
767
+ }
768
+ ],
769
+ "properties": {
770
+ "Node name for S&R": "PreviewImage"
771
+ }
772
+ },
773
+ {
774
+ "id": 32,
775
+ "type": "Reroute",
776
+ "pos": [
777
+ 1273.5936369564051,
778
+ 1699.5227515553931
779
+ ],
780
+ "size": [
781
+ 90.4,
782
+ 26
783
+ ],
784
+ "flags": {},
785
+ "order": 51,
786
+ "mode": 0,
787
+ "inputs": [
788
+ {
789
+ "name": "",
790
+ "type": "*",
791
+ "link": 28,
792
+ "pos": [
793
+ 45.2,
794
+ 0
795
+ ]
796
+ }
797
+ ],
798
+ "outputs": [
799
+ {
800
+ "name": "OPENAI",
801
+ "type": "OPENAI",
802
+ "links": [
803
+ 29
804
+ ],
805
+ "slot_index": 0
806
+ }
807
+ ],
808
+ "properties": {
809
+ "showOutputText": true,
810
+ "horizontal": true
811
+ }
812
+ },
813
+ {
814
+ "id": 8,
815
+ "type": "ChatGPT Simple _O",
816
+ "pos": [
817
+ 498.8685881282818,
818
+ 788.8676148366416
819
+ ],
820
+ "size": {
821
+ "0": 400,
822
+ "1": 200
823
+ },
824
+ "flags": {},
825
+ "order": 7,
826
+ "mode": 0,
827
+ "outputs": [
828
+ {
829
+ "name": "STRING",
830
+ "type": "STRING",
831
+ "links": [
832
+ 8
833
+ ],
834
+ "slot_index": 0
835
+ }
836
+ ],
837
+ "properties": {
838
+ "Node name for S&R": "ChatGPT Simple _O"
839
+ },
840
+ "widgets_values": [
841
+ "dancng girl",
842
+ "api_key.txt",
843
+ "gpt-3.5-turbo",
844
+ 693269650780473,
845
+ false
846
+ ],
847
+ "color": "#232",
848
+ "bgcolor": "#353"
849
+ },
850
+ {
851
+ "id": 26,
852
+ "type": "create image _O",
853
+ "pos": [
854
+ 521.5936369564068,
855
+ 1782.5227515553931
856
+ ],
857
+ "size": {
858
+ "0": 400,
859
+ "1": 200
860
+ },
861
+ "flags": {},
862
+ "order": 58,
863
+ "mode": 0,
864
+ "inputs": [
865
+ {
866
+ "name": "openai",
867
+ "type": "OPENAI",
868
+ "link": 22
869
+ }
870
+ ],
871
+ "outputs": [
872
+ {
873
+ "name": "IMAGE",
874
+ "type": "IMAGE",
875
+ "links": [
876
+ 25,
877
+ 26
878
+ ],
879
+ "slot_index": 0
880
+ },
881
+ {
882
+ "name": "MASK",
883
+ "type": "MASK",
884
+ "links": null
885
+ }
886
+ ],
887
+ "properties": {
888
+ "Node name for S&R": "create image _O"
889
+ },
890
+ "widgets_values": [
891
+ "dancng girl",
892
+ 1,
893
+ "256x256",
894
+ 0,
895
+ false
896
+ ],
897
+ "color": "#232",
898
+ "bgcolor": "#353"
899
+ },
900
+ {
901
+ "id": 17,
902
+ "type": "Chat completion _O",
903
+ "pos": [
904
+ 972.5936369564066,
905
+ 1475.5227515553931
906
+ ],
907
+ "size": {
908
+ "0": 393,
909
+ "1": 126
910
+ },
911
+ "flags": {},
912
+ "order": 59,
913
+ "mode": 0,
914
+ "inputs": [
915
+ {
916
+ "name": "openai",
917
+ "type": "OPENAI",
918
+ "link": 19
919
+ },
920
+ {
921
+ "name": "messages",
922
+ "type": "OPENAI_CHAT_MESSAGES",
923
+ "link": 15
924
+ }
925
+ ],
926
+ "outputs": [
927
+ {
928
+ "name": "STRING",
929
+ "type": "STRING",
930
+ "links": [
931
+ 16
932
+ ],
933
+ "slot_index": 0
934
+ },
935
+ {
936
+ "name": "OPENAI_CHAT_COMPLETION",
937
+ "type": "OPENAI_CHAT_COMPLETION",
938
+ "links": null
939
+ }
940
+ ],
941
+ "properties": {
942
+ "Node name for S&R": "Chat completion _O"
943
+ },
944
+ "widgets_values": [
945
+ "gpt-3.5-turbo",
946
+ 0,
947
+ false
948
+ ],
949
+ "color": "#232",
950
+ "bgcolor": "#353"
951
+ },
952
+ {
953
+ "id": 35,
954
+ "type": "Note _O",
955
+ "pos": [
956
+ -433.40636304359373,
957
+ 2131.522751555393
958
+ ],
959
+ "size": {
960
+ "0": 400,
961
+ "1": 200
962
+ },
963
+ "flags": {},
964
+ "order": 8,
965
+ "mode": 0,
966
+ "properties": {
967
+ "Node name for S&R": "Note _O"
968
+ },
969
+ "widgets_values": [
970
+ "updates\n - removed the no longer necessary string node\n - added a new NSP node\n - enhanced the text2image node\n"
971
+ ],
972
+ "color": "#432",
973
+ "bgcolor": "#653"
974
+ },
975
+ {
976
+ "id": 36,
977
+ "type": "Note _O",
978
+ "pos": [
979
+ 26.593636956406257,
980
+ 2131.522751555393
981
+ ],
982
+ "size": {
983
+ "0": 400,
984
+ "1": 200
985
+ },
986
+ "flags": {},
987
+ "order": 9,
988
+ "mode": 0,
989
+ "properties": {
990
+ "Node name for S&R": "Note _O"
991
+ },
992
+ "widgets_values": [
993
+ "updates\n - this node selects a random value from the NSP file included\n with the package, based on the terminology you select"
994
+ ],
995
+ "color": "#432",
996
+ "bgcolor": "#653"
997
+ },
998
+ {
999
+ "id": 39,
1000
+ "type": "Concat Text _O",
1001
+ "pos": [
1002
+ 989.6058622982029,
1003
+ 2515.5385796803926
1004
+ ],
1005
+ "size": {
1006
+ "0": 255.0090789794922,
1007
+ "1": 78
1008
+ },
1009
+ "flags": {},
1010
+ "order": 48,
1011
+ "mode": 0,
1012
+ "inputs": [
1013
+ {
1014
+ "name": "text1",
1015
+ "type": "STRING",
1016
+ "link": 36,
1017
+ "widget": {
1018
+ "name": "text1",
1019
+ "config": [
1020
+ "STRING",
1021
+ {
1022
+ "multiline": true
1023
+ }
1024
+ ]
1025
+ }
1026
+ },
1027
+ {
1028
+ "name": "text2",
1029
+ "type": "STRING",
1030
+ "link": 37,
1031
+ "widget": {
1032
+ "name": "text2",
1033
+ "config": [
1034
+ "STRING",
1035
+ {
1036
+ "multiline": true
1037
+ }
1038
+ ]
1039
+ }
1040
+ }
1041
+ ],
1042
+ "outputs": [
1043
+ {
1044
+ "name": "STRING",
1045
+ "type": "STRING",
1046
+ "links": [
1047
+ 38
1048
+ ],
1049
+ "slot_index": 0
1050
+ }
1051
+ ],
1052
+ "properties": {
1053
+ "Node name for S&R": "Concat Text _O"
1054
+ },
1055
+ "widgets_values": [
1056
+ "",
1057
+ " at ",
1058
+ ""
1059
+ ]
1060
+ },
1061
+ {
1062
+ "id": 45,
1063
+ "type": "Debug Text _O",
1064
+ "pos": [
1065
+ 1619.6058622982016,
1066
+ 2545.5385796803926
1067
+ ],
1068
+ "size": {
1069
+ "0": 400,
1070
+ "1": 200
1071
+ },
1072
+ "flags": {
1073
+ "collapsed": true
1074
+ },
1075
+ "order": 64,
1076
+ "mode": 0,
1077
+ "inputs": [
1078
+ {
1079
+ "name": "text",
1080
+ "type": "STRING",
1081
+ "link": 39,
1082
+ "widget": {
1083
+ "name": "text",
1084
+ "config": [
1085
+ "STRING",
1086
+ {
1087
+ "multiline": true
1088
+ }
1089
+ ]
1090
+ }
1091
+ }
1092
+ ],
1093
+ "properties": {
1094
+ "Node name for S&R": "Debug Text _O"
1095
+ },
1096
+ "widgets_values": [
1097
+ "",
1098
+ "debug"
1099
+ ]
1100
+ },
1101
+ {
1102
+ "id": 43,
1103
+ "type": "Note _O",
1104
+ "pos": [
1105
+ 987.6058622982031,
1106
+ 2256.5385796803926
1107
+ ],
1108
+ "size": {
1109
+ "0": 261.47479248046875,
1110
+ "1": 202.4876708984375
1111
+ },
1112
+ "flags": {},
1113
+ "order": 10,
1114
+ "mode": 0,
1115
+ "properties": {
1116
+ "Node name for S&R": "Note _O"
1117
+ },
1118
+ "widgets_values": [
1119
+ " - combines two text inputs into one text and\n adds the separator in the middle"
1120
+ ],
1121
+ "color": "#432",
1122
+ "bgcolor": "#653"
1123
+ },
1124
+ {
1125
+ "id": 46,
1126
+ "type": "Note _O",
1127
+ "pos": [
1128
+ 1297.6058622982016,
1129
+ 2255.5385796803926
1130
+ ],
1131
+ "size": {
1132
+ "0": 261.47479248046875,
1133
+ "1": 202.4876708984375
1134
+ },
1135
+ "flags": {},
1136
+ "order": 11,
1137
+ "mode": 0,
1138
+ "properties": {
1139
+ "Node name for S&R": "Note _O"
1140
+ },
1141
+ "widgets_values": [
1142
+ " - replaces all occurrences of the (old) value with\n the (new) value"
1143
+ ],
1144
+ "color": "#432",
1145
+ "bgcolor": "#653"
1146
+ },
1147
+ {
1148
+ "id": 41,
1149
+ "type": "Trim Text _O",
1150
+ "pos": [
1151
+ 689.6058622982031,
1152
+ 2515.5385796803926
1153
+ ],
1154
+ "size": {
1155
+ "0": 239.58309936523438,
1156
+ "1": 34
1157
+ },
1158
+ "flags": {},
1159
+ "order": 42,
1160
+ "mode": 0,
1161
+ "inputs": [
1162
+ {
1163
+ "name": "text",
1164
+ "type": "STRING",
1165
+ "link": 35,
1166
+ "widget": {
1167
+ "name": "text",
1168
+ "config": [
1169
+ "STRING",
1170
+ {
1171
+ "multiline": true
1172
+ }
1173
+ ]
1174
+ }
1175
+ }
1176
+ ],
1177
+ "outputs": [
1178
+ {
1179
+ "name": "STRING",
1180
+ "type": "STRING",
1181
+ "links": [
1182
+ 36
1183
+ ],
1184
+ "slot_index": 0
1185
+ }
1186
+ ],
1187
+ "properties": {
1188
+ "Node name for S&R": "Trim Text _O"
1189
+ },
1190
+ "widgets_values": [
1191
+ ""
1192
+ ]
1193
+ },
1194
+ {
1195
+ "id": 47,
1196
+ "type": "Note _O",
1197
+ "pos": [
1198
+ 1596.6058622982016,
1199
+ 2262.5385796803926
1200
+ ],
1201
+ "size": {
1202
+ "0": 210,
1203
+ "1": 175.2283172607422
1204
+ },
1205
+ "flags": {},
1206
+ "order": 12,
1207
+ "mode": 0,
1208
+ "properties": {
1209
+ "Node name for S&R": "Note _O"
1210
+ },
1211
+ "widgets_values": [
1212
+ " - debug will write the text to the\n console"
1213
+ ],
1214
+ "color": "#432",
1215
+ "bgcolor": "#653"
1216
+ },
1217
+ {
1218
+ "id": 44,
1219
+ "type": "Replace Text _O",
1220
+ "pos": [
1221
+ 1279.6058622982016,
1222
+ 2515.5385796803926
1223
+ ],
1224
+ "size": {
1225
+ "0": 289.0200500488281,
1226
+ "1": 83.88400268554688
1227
+ },
1228
+ "flags": {},
1229
+ "order": 56,
1230
+ "mode": 0,
1231
+ "inputs": [
1232
+ {
1233
+ "name": "text",
1234
+ "type": "STRING",
1235
+ "link": 38,
1236
+ "widget": {
1237
+ "name": "text",
1238
+ "config": [
1239
+ "STRING",
1240
+ {
1241
+ "multiline": true
1242
+ }
1243
+ ]
1244
+ }
1245
+ }
1246
+ ],
1247
+ "outputs": [
1248
+ {
1249
+ "name": "STRING",
1250
+ "type": "STRING",
1251
+ "links": [
1252
+ 39,
1253
+ 41
1254
+ ],
1255
+ "slot_index": 0
1256
+ }
1257
+ ],
1258
+ "properties": {
1259
+ "Node name for S&R": "Replace Text _O"
1260
+ },
1261
+ "widgets_values": [
1262
+ "",
1263
+ "Wizard",
1264
+ "Witch"
1265
+ ]
1266
+ },
1267
+ {
1268
+ "id": 52,
1269
+ "type": "Note _O",
1270
+ "pos": [
1271
+ 46.593636956406215,
1272
+ 3032.522751555393
1273
+ ],
1274
+ "size": {
1275
+ "0": 261.47479248046875,
1276
+ "1": 202.4876708984375
1277
+ },
1278
+ "flags": {},
1279
+ "order": 13,
1280
+ "mode": 0,
1281
+ "properties": {
1282
+ "Node name for S&R": "Note _O"
1283
+ },
1284
+ "widgets_values": [
1285
+ " - uses text instead of String\n - allows transparent text and background\n - allows you to set the image size\n - expand: this option resizes the\n result image to fit the text if\n the image doesn't fit\n - x,y moves the text around the image;\n it points to the text center"
1286
+ ],
1287
+ "color": "#432",
1288
+ "bgcolor": "#653"
1289
+ },
1290
+ {
1291
+ "id": 51,
1292
+ "type": "PreviewImage",
1293
+ "pos": [
1294
+ 834.5936369564066,
1295
+ 3057.522751555393
1296
+ ],
1297
+ "size": {
1298
+ "0": 1037.8057861328125,
1299
+ "1": 210.6422882080078
1300
+ },
1301
+ "flags": {},
1302
+ "order": 76,
1303
+ "mode": 0,
1304
+ "inputs": [
1305
+ {
1306
+ "name": "images",
1307
+ "type": "IMAGE",
1308
+ "link": 45
1309
+ }
1310
+ ],
1311
+ "properties": {
1312
+ "Node name for S&R": "PreviewImage"
1313
+ }
1314
+ },
1315
+ {
1316
+ "id": 42,
1317
+ "type": "Note _O",
1318
+ "pos": [
1319
+ 670.5936369564066,
1320
+ 2256.522751555393
1321
+ ],
1322
+ "size": {
1323
+ "0": 261.47479248046875,
1324
+ "1": 202.4876708984375
1325
+ },
1326
+ "flags": {},
1327
+ "order": 14,
1328
+ "mode": 0,
1329
+ "properties": {
1330
+ "Node name for S&R": "Note _O"
1331
+ },
1332
+ "widgets_values": [
1333
+ " - trim removes any extra spaces before or\n after the text, if found"
1334
+ ],
1335
+ "color": "#432",
1336
+ "bgcolor": "#653"
1337
+ },
1338
+ {
1339
+ "id": 50,
1340
+ "type": "Reroute",
1341
+ "pos": [
1342
+ 172.59363695640633,
1343
+ 2926.522751555393
1344
+ ],
1345
+ "size": [
1346
+ 75,
1347
+ 26
1348
+ ],
1349
+ "flags": {},
1350
+ "order": 72,
1351
+ "mode": 0,
1352
+ "inputs": [
1353
+ {
1354
+ "name": "",
1355
+ "type": "*",
1356
+ "link": 46,
1357
+ "pos": [
1358
+ 37.5,
1359
+ 0
1360
+ ]
1361
+ }
1362
+ ],
1363
+ "outputs": [
1364
+ {
1365
+ "name": "",
1366
+ "type": "STRING",
1367
+ "links": [
1368
+ 44
1369
+ ]
1370
+ }
1371
+ ],
1372
+ "properties": {
1373
+ "showOutputText": false,
1374
+ "horizontal": true
1375
+ }
1376
+ },
1377
+ {
1378
+ "id": 49,
1379
+ "type": "Reroute",
1380
+ "pos": [
1381
+ 1856.5936369564051,
1382
+ 2801.522751555393
1383
+ ],
1384
+ "size": [
1385
+ 75,
1386
+ 26
1387
+ ],
1388
+ "flags": {},
1389
+ "order": 65,
1390
+ "mode": 0,
1391
+ "inputs": [
1392
+ {
1393
+ "name": "",
1394
+ "type": "*",
1395
+ "link": 41,
1396
+ "pos": [
1397
+ 37.5,
1398
+ 0
1399
+ ]
1400
+ }
1401
+ ],
1402
+ "outputs": [
1403
+ {
1404
+ "name": "",
1405
+ "type": "STRING",
1406
+ "links": [
1407
+ 46
1408
+ ],
1409
+ "slot_index": 0
1410
+ }
1411
+ ],
1412
+ "properties": {
1413
+ "showOutputText": false,
1414
+ "horizontal": true
1415
+ }
1416
+ },
1417
+ {
1418
+ "id": 53,
1419
+ "type": "Note _O",
1420
+ "pos": [
1421
+ 78.44912461814945,
1422
+ 4266.738520461633
1423
+ ],
1424
+ "size": {
1425
+ "0": 400,
1426
+ "1": 200
1427
+ },
1428
+ "flags": {},
1429
+ "order": 15,
1430
+ "mode": 0,
1431
+ "properties": {
1432
+ "Node name for S&R": "Note _O"
1433
+ },
1434
+ "widgets_values": [
1435
+ ""
1436
+ ]
1437
+ },
1438
+ {
1439
+ "id": 56,
1440
+ "type": "int _O",
1441
+ "pos": [
1442
+ 1552.4491246181487,
1443
+ 4286.738520461633
1444
+ ],
1445
+ "size": {
1446
+ "0": 315,
1447
+ "1": 58
1448
+ },
1449
+ "flags": {},
1450
+ "order": 16,
1451
+ "mode": 0,
1452
+ "outputs": [
1453
+ {
1454
+ "name": "INT",
1455
+ "type": "INT",
1456
+ "links": null
1457
+ }
1458
+ ],
1459
+ "properties": {
1460
+ "Node name for S&R": "int _O"
1461
+ },
1462
+ "widgets_values": [
1463
+ 0
1464
+ ]
1465
+ },
1466
+ {
1467
+ "id": 55,
1468
+ "type": "seed _O",
1469
+ "pos": [
1470
+ 1091.4491246181487,
1471
+ 4277.738520461633
1472
+ ],
1473
+ "size": {
1474
+ "0": 315,
1475
+ "1": 82
1476
+ },
1477
+ "flags": {},
1478
+ "order": 17,
1479
+ "mode": 0,
1480
+ "outputs": [
1481
+ {
1482
+ "name": "INT",
1483
+ "type": "INT",
1484
+ "links": null
1485
+ }
1486
+ ],
1487
+ "properties": {
1488
+ "Node name for S&R": "seed _O"
1489
+ },
1490
+ "widgets_values": [
1491
+ 36410646047828,
1492
+ true
1493
+ ]
1494
+ },
1495
+ {
1496
+ "id": 54,
1497
+ "type": "Text _O",
1498
+ "pos": [
1499
+ 576.4491246181497,
1500
+ 4268.738520461633
1501
+ ],
1502
+ "size": {
1503
+ "0": 400,
1504
+ "1": 200
1505
+ },
1506
+ "flags": {},
1507
+ "order": 18,
1508
+ "mode": 0,
1509
+ "outputs": [
1510
+ {
1511
+ "name": "STRING",
1512
+ "type": "STRING",
1513
+ "links": null
1514
+ }
1515
+ ],
1516
+ "properties": {
1517
+ "Node name for S&R": "Text _O"
1518
+ },
1519
+ "widgets_values": [
1520
+ ""
1521
+ ]
1522
+ },
1523
+ {
1524
+ "id": 58,
1525
+ "type": "Note _O",
1526
+ "pos": [
1527
+ 158.4491246181495,
1528
+ 4100.738520461642
1529
+ ],
1530
+ "size": {
1531
+ "0": 229.99794006347656,
1532
+ "1": 103.36981201171875
1533
+ },
1534
+ "flags": {},
1535
+ "order": 19,
1536
+ "mode": 0,
1537
+ "properties": {
1538
+ "Node name for S&R": "Note _O"
1539
+ },
1540
+ "widgets_values": [
1541
+ "an empty node that can be used to write notes X) \n"
1542
+ ],
1543
+ "color": "#432",
1544
+ "bgcolor": "#653"
1545
+ },
1546
+ {
1547
+ "id": 59,
1548
+ "type": "Note _O",
1549
+ "pos": [
1550
+ 657.4491246181497,
1551
+ 4099.738520461642
1552
+ ],
1553
+ "size": {
1554
+ "0": 229.99794006347656,
1555
+ "1": 103.36981201171875
1556
+ },
1557
+ "flags": {},
1558
+ "order": 20,
1559
+ "mode": 0,
1560
+ "properties": {
1561
+ "Node name for S&R": "Note _O"
1562
+ },
1563
+ "widgets_values": [
1564
+ "text input node"
1565
+ ],
1566
+ "color": "#432",
1567
+ "bgcolor": "#653"
1568
+ },
1569
+ {
1570
+ "id": 60,
1571
+ "type": "Note _O",
1572
+ "pos": [
1573
+ 1133.4491246181487,
1574
+ 4100.738520461642
1575
+ ],
1576
+ "size": {
1577
+ "0": 229.99794006347656,
1578
+ "1": 103.36981201171875
1579
+ },
1580
+ "flags": {},
1581
+ "order": 21,
1582
+ "mode": 0,
1583
+ "properties": {
1584
+ "Node name for S&R": "Note _O"
1585
+ },
1586
+ "widgets_values": [
1587
+ "seed input node"
1588
+ ],
1589
+ "color": "#432",
1590
+ "bgcolor": "#653"
1591
+ },
1592
+ {
1593
+ "id": 61,
1594
+ "type": "Note _O",
1595
+ "pos": [
1596
+ 1596.4491246181487,
1597
+ 4101.738520461642
1598
+ ],
1599
+ "size": {
1600
+ "0": 229.99794006347656,
1601
+ "1": 103.36981201171875
1602
+ },
1603
+ "flags": {},
1604
+ "order": 22,
1605
+ "mode": 0,
1606
+ "properties": {
1607
+ "Node name for S&R": "Note _O"
1608
+ },
1609
+ "widgets_values": [
1610
+ "number input nodes"
1611
+ ],
1612
+ "color": "#432",
1613
+ "bgcolor": "#653"
1614
+ },
1615
+ {
1616
+ "id": 57,
1617
+ "type": "float _O",
1618
+ "pos": [
1619
+ 1556.4491246181487,
1620
+ 4417.738520461633
1621
+ ],
1622
+ "size": {
1623
+ "0": 315,
1624
+ "1": 58
1625
+ },
1626
+ "flags": {},
1627
+ "order": 23,
1628
+ "mode": 0,
1629
+ "outputs": [
1630
+ {
1631
+ "name": "FLOAT",
1632
+ "type": "FLOAT",
1633
+ "links": null
1634
+ }
1635
+ ],
1636
+ "properties": {
1637
+ "Node name for S&R": "float _O"
1638
+ },
1639
+ "widgets_values": [
1640
+ 0
1641
+ ]
1642
+ },
1643
+ {
1644
+ "id": 31,
1645
+ "type": "variation_image _O",
1646
+ "pos": [
1647
+ 1386.5936369564051,
1648
+ 1771.5227515553931
1649
+ ],
1650
+ "size": {
1651
+ "0": 315,
1652
+ "1": 150
1653
+ },
1654
+ "flags": {},
1655
+ "order": 68,
1656
+ "mode": 0,
1657
+ "inputs": [
1658
+ {
1659
+ "name": "openai",
1660
+ "type": "OPENAI",
1661
+ "link": 29
1662
+ },
1663
+ {
1664
+ "name": "image",
1665
+ "type": "IMAGE",
1666
+ "link": 26
1667
+ }
1668
+ ],
1669
+ "outputs": [
1670
+ {
1671
+ "name": "IMAGE",
1672
+ "type": "IMAGE",
1673
+ "links": [
1674
+ 30
1675
+ ],
1676
+ "slot_index": 0
1677
+ },
1678
+ {
1679
+ "name": "MASK",
1680
+ "type": "MASK",
1681
+ "links": null
1682
+ }
1683
+ ],
1684
+ "properties": {
1685
+ "Node name for S&R": "variation_image _O"
1686
+ },
1687
+ "widgets_values": [
1688
+ 1,
1689
+ "256x256",
1690
+ 0,
1691
+ false
1692
+ ],
1693
+ "color": "#232",
1694
+ "bgcolor": "#353"
1695
+ },
1696
+ {
1697
+ "id": 9,
1698
+ "type": "Note _O",
1699
+ "pos": [
1700
+ -438.375,
1701
+ 36.28409090909091
1702
+ ],
1703
+ "size": {
1704
+ "0": 400,
1705
+ "1": 200
1706
+ },
1707
+ "flags": {},
1708
+ "order": 24,
1709
+ "mode": 0,
1710
+ "properties": {
1711
+ "Node name for S&R": "Note _O"
1712
+ },
1713
+ "widgets_values": [
1714
+ "In this example, you can write your equation to be applied to the input"
1715
+ ],
1716
+ "color": "#432",
1717
+ "bgcolor": "#653"
1718
+ },
1719
+ {
1720
+ "id": 10,
1721
+ "type": "Note _O",
1722
+ "pos": [
1723
+ 27,
1724
+ 732
1725
+ ],
1726
+ "size": {
1727
+ "0": 400,
1728
+ "1": 200
1729
+ },
1730
+ "flags": {},
1731
+ "order": 25,
1732
+ "mode": 0,
1733
+ "properties": {
1734
+ "Node name for S&R": "Note _O"
1735
+ },
1736
+ "widgets_values": [
1737
+ "ChatGPT updates\n - supports selecting the model,\n so if you have access to gpt-4 you can use it\n\n - added a seed input (it is not a real seed, but it is used to make\n the node generate a new output)\n\n\n//note: using ChatGPT with the revAnimated or mistoonAnime checkpoints produces stunningly accurate results"
1738
+ ],
1739
+ "color": "#432",
1740
+ "bgcolor": "#653"
1741
+ },
1742
+ {
1743
+ "id": 23,
1744
+ "type": "Note _O",
1745
+ "pos": [
1746
+ -428,
1747
+ -452
1748
+ ],
1749
+ "size": {
1750
+ "0": 400,
1751
+ "1": 200
1752
+ },
1753
+ "flags": {},
1754
+ "order": 26,
1755
+ "mode": 0,
1756
+ "properties": {
1757
+ "Node name for S&R": "Note _O"
1758
+ },
1759
+ "widgets_values": [
1760
+ "Thanks for using my tools!\n\n- kindly note that the green-colored nodes are the new additions in\n this version"
1761
+ ],
1762
+ "color": "#432",
1763
+ "bgcolor": "#653"
1764
+ },
1765
+ {
1766
+ "id": 21,
1767
+ "type": "Debug Text _O",
1768
+ "pos": [
1769
+ 1460,
1770
+ 1541
1771
+ ],
1772
+ "size": {
1773
+ "0": 210,
1774
+ "1": 58
1775
+ },
1776
+ "flags": {
1777
+ "collapsed": true
1778
+ },
1779
+ "order": 69,
1780
+ "mode": 0,
1781
+ "inputs": [
1782
+ {
1783
+ "name": "text",
1784
+ "type": "STRING",
1785
+ "link": 16,
1786
+ "widget": {
1787
+ "name": "text",
1788
+ "config": [
1789
+ "STRING",
1790
+ {
1791
+ "multiline": true
1792
+ }
1793
+ ]
1794
+ }
1795
+ }
1796
+ ],
1797
+ "properties": {
1798
+ "Node name for S&R": "Debug Text _O"
1799
+ },
1800
+ "widgets_values": [
1801
+ "",
1802
+ "debug"
1803
+ ]
1804
+ },
1805
+ {
1806
+ "id": 63,
1807
+ "type": "ImageScaleFactor _O",
1808
+ "pos": [
1809
+ 1433,
1810
+ 351
1811
+ ],
1812
+ "size": {
1813
+ "0": 315,
1814
+ "1": 154
1815
+ },
1816
+ "flags": {},
1817
+ "order": 27,
1818
+ "mode": 0,
1819
+ "inputs": [
1820
+ {
1821
+ "name": "image",
1822
+ "type": "IMAGE",
1823
+ "link": null
1824
+ }
1825
+ ],
1826
+ "outputs": [
1827
+ {
1828
+ "name": "IMAGE",
1829
+ "type": "IMAGE",
1830
+ "links": null
1831
+ }
1832
+ ],
1833
+ "properties": {
1834
+ "Node name for S&R": "ImageScaleFactor _O"
1835
+ },
1836
+ "widgets_values": [
1837
+ "nearest-exact",
1838
+ 1.25,
1839
+ 1.25,
1840
+ "enabled",
1841
+ "disabled"
1842
+ ]
1843
+ },
1844
+ {
1845
+ "id": 67,
1846
+ "type": "Note _O",
1847
+ "pos": [
1848
+ 1389,
1849
+ 90
1850
+ ],
1851
+ "size": {
1852
+ "0": 400,
1853
+ "1": 200
1854
+ },
1855
+ "flags": {},
1856
+ "order": 28,
1857
+ "mode": 0,
1858
+ "properties": {
1859
+ "Node name for S&R": "Note _O"
1860
+ },
1861
+ "widgets_values": [
1862
+ "Upscales the image using scale factors"
1863
+ ],
1864
+ "color": "#432",
1865
+ "bgcolor": "#653"
1866
+ },
1867
+ {
1868
+ "id": 64,
1869
+ "type": "Note _O",
1870
+ "pos": [
1871
+ -325,
1872
+ 3465
1873
+ ],
1874
+ "size": [
1875
+ 305.0923840439583,
1876
+ 101.80222838475856
1877
+ ],
1878
+ "flags": {},
1879
+ "order": 29,
1880
+ "mode": 0,
1881
+ "properties": {
1882
+ "Node name for S&R": "Note _O"
1883
+ },
1884
+ "widgets_values": [
1885
+ "latent tools\n - new node added: SelectLatentFromBatch_O\n\nit is useful if you want to select one image to continue working on after generating multiple images"
1886
+ ],
1887
+ "color": "#432",
1888
+ "bgcolor": "#653"
1889
+ },
1890
+ {
1891
+ "id": 65,
1892
+ "type": "Note _O",
1893
+ "pos": [
1894
+ -288,
1895
+ 4046
1896
+ ],
1897
+ "size": {
1898
+ "0": 229.99794006347656,
1899
+ "1": 103.36981201171875
1900
+ },
1901
+ "flags": {},
1902
+ "order": 30,
1903
+ "mode": 0,
1904
+ "properties": {
1905
+ "Node name for S&R": "Note _O"
1906
+ },
1907
+ "widgets_values": [
1908
+ "utility nodes\n\n- the input nodes are good if you want to\n reroute after them, as currently the\n primitive node doesn't work with\n reroute nodes"
1909
+ ],
1910
+ "color": "#432",
1911
+ "bgcolor": "#653"
1912
+ },
1913
+ {
1914
+ "id": 69,
1915
+ "type": "CheckpointLoaderSimple",
1916
+ "pos": [
1917
+ 38,
1918
+ 3519
1919
+ ],
1920
+ "size": {
1921
+ "0": 210,
1922
+ "1": 98
1923
+ },
1924
+ "flags": {},
1925
+ "order": 31,
1926
+ "mode": 0,
1927
+ "outputs": [
1928
+ {
1929
+ "name": "MODEL",
1930
+ "type": "MODEL",
1931
+ "links": [
1932
+ 47
1933
+ ],
1934
+ "slot_index": 0
1935
+ },
1936
+ {
1937
+ "name": "CLIP",
1938
+ "type": "CLIP",
1939
+ "links": [
1940
+ 53,
1941
+ 54
1942
+ ],
1943
+ "slot_index": 1
1944
+ },
1945
+ {
1946
+ "name": "VAE",
1947
+ "type": "VAE",
1948
+ "links": [
1949
+ 59
1950
+ ],
1951
+ "slot_index": 2
1952
+ }
1953
+ ],
1954
+ "properties": {
1955
+ "Node name for S&R": "CheckpointLoaderSimple"
1956
+ },
1957
+ "widgets_values": [
1958
+ "sd-v1-4.ckpt"
1959
+ ]
1960
+ },
1961
+ {
1962
+ "id": 70,
1963
+ "type": "CLIPTextEncode",
1964
+ "pos": [
1965
+ 43,
1966
+ 3657
1967
+ ],
1968
+ "size": [
1969
+ 210,
1970
+ 76.00001335144043
1971
+ ],
1972
+ "flags": {},
1973
+ "order": 39,
1974
+ "mode": 0,
1975
+ "inputs": [
1976
+ {
1977
+ "name": "clip",
1978
+ "type": "CLIP",
1979
+ "link": 53
1980
+ }
1981
+ ],
1982
+ "outputs": [
1983
+ {
1984
+ "name": "CONDITIONING",
1985
+ "type": "CONDITIONING",
1986
+ "links": [
1987
+ 48
1988
+ ]
1989
+ }
1990
+ ],
1991
+ "properties": {
1992
+ "Node name for S&R": "CLIPTextEncode"
1993
+ },
1994
+ "widgets_values": [
1995
+ "cute girl "
1996
+ ]
1997
+ },
1998
+ {
1999
+ "id": 71,
2000
+ "type": "CLIPTextEncode",
2001
+ "pos": [
2002
+ 48,
2003
+ 3770
2004
+ ],
2005
+ "size": {
2006
+ "0": 210,
2007
+ "1": 76.00001525878906
2008
+ },
2009
+ "flags": {},
2010
+ "order": 40,
2011
+ "mode": 0,
2012
+ "inputs": [
2013
+ {
2014
+ "name": "clip",
2015
+ "type": "CLIP",
2016
+ "link": 54
2017
+ }
2018
+ ],
2019
+ "outputs": [
2020
+ {
2021
+ "name": "CONDITIONING",
2022
+ "type": "CONDITIONING",
2023
+ "links": [
2024
+ 49
2025
+ ],
2026
+ "slot_index": 0
2027
+ }
2028
+ ],
2029
+ "properties": {
2030
+ "Node name for S&R": "CLIPTextEncode"
2031
+ },
2032
+ "widgets_values": [
2033
+ "bad hands "
2034
+ ]
2035
+ },
2036
+ {
2037
+ "id": 72,
2038
+ "type": "EmptyLatentImage",
2039
+ "pos": [
2040
+ 49,
2041
+ 3883
2042
+ ],
2043
+ "size": {
2044
+ "0": 210,
2045
+ "1": 106
2046
+ },
2047
+ "flags": {},
2048
+ "order": 32,
2049
+ "mode": 0,
2050
+ "outputs": [
2051
+ {
2052
+ "name": "LATENT",
2053
+ "type": "LATENT",
2054
+ "links": [
2055
+ 50
2056
+ ]
2057
+ }
2058
+ ],
2059
+ "properties": {
2060
+ "Node name for S&R": "EmptyLatentImage"
2061
+ },
2062
+ "widgets_values": [
2063
+ 512,
2064
+ 512,
2065
+ 4
2066
+ ],
2067
+ "color": "#323",
2068
+ "bgcolor": "#535"
2069
+ },
2070
+ {
2071
+ "id": 73,
2072
+ "type": "VAEDecode",
2073
+ "pos": [
2074
+ 739.204545454546,
2075
+ 3541.295454545454
2076
+ ],
2077
+ "size": {
2078
+ "0": 140,
2079
+ "1": 46
2080
+ },
2081
+ "flags": {},
2082
+ "order": 54,
2083
+ "mode": 0,
2084
+ "inputs": [
2085
+ {
2086
+ "name": "samples",
2087
+ "type": "LATENT",
2088
+ "link": 51
2089
+ },
2090
+ {
2091
+ "name": "vae",
2092
+ "type": "VAE",
2093
+ "link": 69
2094
+ }
2095
+ ],
2096
+ "outputs": [
2097
+ {
2098
+ "name": "IMAGE",
2099
+ "type": "IMAGE",
2100
+ "links": [
2101
+ 55
2102
+ ],
2103
+ "slot_index": 0
2104
+ }
2105
+ ],
2106
+ "properties": {
2107
+ "Node name for S&R": "VAEDecode"
2108
+ }
2109
+ },
2110
+ {
2111
+ "id": 77,
2112
+ "type": "Reroute",
2113
+ "pos": [
2114
+ 254,
2115
+ 3464
2116
+ ],
2117
+ "size": [
2118
+ 75,
2119
+ 26
2120
+ ],
2121
+ "flags": {},
2122
+ "order": 41,
2123
+ "mode": 0,
2124
+ "inputs": [
2125
+ {
2126
+ "name": "",
2127
+ "type": "*",
2128
+ "link": 59
2129
+ }
2130
+ ],
2131
+ "outputs": [
2132
+ {
2133
+ "name": "VAE",
2134
+ "type": "VAE",
2135
+ "links": [
2136
+ 62
2137
+ ],
2138
+ "slot_index": 0
2139
+ }
2140
+ ],
2141
+ "properties": {
2142
+ "showOutputText": true,
2143
+ "horizontal": false
2144
+ }
2145
+ },
2146
+ {
2147
+ "id": 78,
2148
+ "type": "Reroute",
2149
+ "pos": [
2150
+ 657.204545454546,
2151
+ 3467.295454545454
2152
+ ],
2153
+ "size": [
2154
+ 75,
2155
+ 26
2156
+ ],
2157
+ "flags": {},
2158
+ "order": 47,
2159
+ "mode": 0,
2160
+ "inputs": [
2161
+ {
2162
+ "name": "",
2163
+ "type": "*",
2164
+ "link": 62
2165
+ }
2166
+ ],
2167
+ "outputs": [
2168
+ {
2169
+ "name": "VAE",
2170
+ "type": "VAE",
2171
+ "links": [
2172
+ 69,
2173
+ 78
2174
+ ],
2175
+ "slot_index": 0
2176
+ }
2177
+ ],
2178
+ "properties": {
2179
+ "showOutputText": true,
2180
+ "horizontal": false
2181
+ }
2182
+ },
2183
+ {
2184
+ "id": 74,
2185
+ "type": "PreviewImage",
2186
+ "pos": [
2187
+ 895,
2188
+ 3458
2189
+ ],
2190
+ "size": {
2191
+ "0": 210,
2192
+ "1": 250
2193
+ },
2194
+ "flags": {},
2195
+ "order": 61,
2196
+ "mode": 0,
2197
+ "inputs": [
2198
+ {
2199
+ "name": "images",
2200
+ "type": "IMAGE",
2201
+ "link": 55
2202
+ }
2203
+ ],
2204
+ "properties": {
2205
+ "Node name for S&R": "PreviewImage"
2206
+ }
2207
+ },
2208
+ {
2209
+ "id": 76,
2210
+ "type": "PreviewImage",
2211
+ "pos": [
2212
+ 1382,
2213
+ 3560.75
2214
+ ],
2215
+ "size": {
2216
+ "0": 210,
2217
+ "1": 250
2218
+ },
2219
+ "flags": {},
2220
+ "order": 70,
2221
+ "mode": 0,
2222
+ "inputs": [
2223
+ {
2224
+ "name": "images",
2225
+ "type": "IMAGE",
2226
+ "link": 58
2227
+ }
2228
+ ],
2229
+ "properties": {
2230
+ "Node name for S&R": "PreviewImage"
2231
+ }
2232
+ },
2233
+ {
2234
+ "id": 80,
2235
+ "type": "VAEDecode",
2236
+ "pos": [
2237
+ 1640,
2238
+ 3850
2239
+ ],
2240
+ "size": {
2241
+ "0": 140,
2242
+ "1": 46
2243
+ },
2244
+ "flags": {},
2245
+ "order": 71,
2246
+ "mode": 0,
2247
+ "inputs": [
2248
+ {
2249
+ "name": "samples",
2250
+ "type": "LATENT",
2251
+ "link": 66
2252
+ },
2253
+ {
2254
+ "name": "vae",
2255
+ "type": "VAE",
2256
+ "link": 77
2257
+ }
2258
+ ],
2259
+ "outputs": [
2260
+ {
2261
+ "name": "IMAGE",
2262
+ "type": "IMAGE",
2263
+ "links": [
2264
+ 68
2265
+ ],
2266
+ "slot_index": 0
2267
+ }
2268
+ ],
2269
+ "properties": {
2270
+ "Node name for S&R": "VAEDecode"
2271
+ }
2272
+ },
2273
+ {
2274
+ "id": 81,
2275
+ "type": "PreviewImage",
2276
+ "pos": [
2277
+ 1797,
2278
+ 3726
2279
+ ],
2280
+ "size": {
2281
+ "0": 210,
2282
+ "1": 250
2283
+ },
2284
+ "flags": {},
2285
+ "order": 74,
2286
+ "mode": 0,
2287
+ "inputs": [
2288
+ {
2289
+ "name": "images",
2290
+ "type": "IMAGE",
2291
+ "link": 68
2292
+ }
2293
+ ],
2294
+ "properties": {
2295
+ "Node name for S&R": "PreviewImage"
2296
+ }
2297
+ },
2298
+ {
2299
+ "id": 62,
2300
+ "type": "LatentUpscaleFactor _O",
2301
+ "pos": [
2302
+ 838,
2303
+ 3848
2304
+ ],
2305
+ "size": {
2306
+ "0": 315,
2307
+ "1": 130
2308
+ },
2309
+ "flags": {},
2310
+ "order": 60,
2311
+ "mode": 0,
2312
+ "inputs": [
2313
+ {
2314
+ "name": "samples",
2315
+ "type": "LATENT",
2316
+ "link": 65
2317
+ }
2318
+ ],
2319
+ "outputs": [
2320
+ {
2321
+ "name": "LATENT",
2322
+ "type": "LATENT",
2323
+ "links": [
2324
+ 66
2325
+ ],
2326
+ "slot_index": 0
2327
+ }
2328
+ ],
2329
+ "properties": {
2330
+ "Node name for S&R": "LatentUpscaleFactor _O"
2331
+ },
2332
+ "widgets_values": [
2333
+ "bilinear",
2334
+ 1.25,
2335
+ 1.25,
2336
+ "disabled"
2337
+ ],
2338
+ "color": "#232",
2339
+ "bgcolor": "#353"
2340
+ },
2341
+ {
2342
+ "id": 68,
2343
+ "type": "KSampler",
2344
+ "pos": [
2345
+ 284,
2346
+ 3540
2347
+ ],
2348
+ "size": [
2349
+ 210,
2350
+ 430.00315768659584
2351
+ ],
2352
+ "flags": {},
2353
+ "order": 46,
2354
+ "mode": 0,
2355
+ "inputs": [
2356
+ {
2357
+ "name": "model",
2358
+ "type": "MODEL",
2359
+ "link": 47
2360
+ },
2361
+ {
2362
+ "name": "positive",
2363
+ "type": "CONDITIONING",
2364
+ "link": 48,
2365
+ "slot_index": 1
2366
+ },
2367
+ {
2368
+ "name": "negative",
2369
+ "type": "CONDITIONING",
2370
+ "link": 49
2371
+ },
2372
+ {
2373
+ "name": "latent_image",
2374
+ "type": "LATENT",
2375
+ "link": 50,
2376
+ "slot_index": 3
2377
+ }
2378
+ ],
2379
+ "outputs": [
2380
+ {
2381
+ "name": "LATENT",
2382
+ "type": "LATENT",
2383
+ "links": [
2384
+ 51,
2385
+ 56
2386
+ ],
2387
+ "slot_index": 0
2388
+ }
2389
+ ],
2390
+ "properties": {
2391
+ "Node name for S&R": "KSampler"
2392
+ },
2393
+ "widgets_values": [
2394
+ 1020066313120726,
2395
+ false,
2396
+ 20,
2397
+ 8,
2398
+ "euler",
2399
+ "karras",
2400
+ 1
2401
+ ]
2402
+ },
2403
+ {
2404
+ "id": 66,
2405
+ "type": "selectLatentFromBatch _O",
2406
+ "pos": [
2407
+ 571,
2408
+ 3721
2409
+ ],
2410
+ "size": {
2411
+ "0": 210,
2412
+ "1": 58
2413
+ },
2414
+ "flags": {},
2415
+ "order": 53,
2416
+ "mode": 0,
2417
+ "inputs": [
2418
+ {
2419
+ "name": "samples",
2420
+ "type": "LATENT",
2421
+ "link": 56
2422
+ }
2423
+ ],
2424
+ "outputs": [
2425
+ {
2426
+ "name": "LATENT",
2427
+ "type": "LATENT",
2428
+ "links": [
2429
+ 57,
2430
+ 65
2431
+ ],
2432
+ "slot_index": 0
2433
+ }
2434
+ ],
2435
+ "properties": {
2436
+ "Node name for S&R": "selectLatentFromBatch _O"
2437
+ },
2438
+ "widgets_values": [
2439
+ 2
2440
+ ],
2441
+ "color": "#232",
2442
+ "bgcolor": "#353"
2443
+ },
2444
+ {
2445
+ "id": 75,
2446
+ "type": "VAEDecode",
2447
+ "pos": [
2448
+ 1203,
2449
+ 3721
2450
+ ],
2451
+ "size": {
2452
+ "0": 140,
2453
+ "1": 46
2454
+ },
2455
+ "flags": {},
2456
+ "order": 62,
2457
+ "mode": 0,
2458
+ "inputs": [
2459
+ {
2460
+ "name": "samples",
2461
+ "type": "LATENT",
2462
+ "link": 57
2463
+ },
2464
+ {
2465
+ "name": "vae",
2466
+ "type": "VAE",
2467
+ "link": 75
2468
+ }
2469
+ ],
2470
+ "outputs": [
2471
+ {
2472
+ "name": "IMAGE",
2473
+ "type": "IMAGE",
2474
+ "links": [
2475
+ 58
2476
+ ],
2477
+ "slot_index": 0
2478
+ }
2479
+ ],
2480
+ "properties": {
2481
+ "Node name for S&R": "VAEDecode"
2482
+ }
2483
+ },
2484
+ {
2485
+ "id": 83,
2486
+ "type": "Reroute",
2487
+ "pos": [
2488
+ 1129,
2489
+ 3471
2490
+ ],
2491
+ "size": [
2492
+ 75,
2493
+ 26
2494
+ ],
2495
+ "flags": {},
2496
+ "order": 55,
2497
+ "mode": 0,
2498
+ "inputs": [
2499
+ {
2500
+ "name": "",
2501
+ "type": "*",
2502
+ "link": 78
2503
+ }
2504
+ ],
2505
+ "outputs": [
2506
+ {
2507
+ "name": "VAE",
2508
+ "type": "VAE",
2509
+ "links": [
2510
+ 75,
2511
+ 76
2512
+ ],
2513
+ "slot_index": 0
2514
+ }
2515
+ ],
2516
+ "properties": {
2517
+ "showOutputText": true,
2518
+ "horizontal": false
2519
+ }
2520
+ },
2521
+ {
2522
+ "id": 84,
2523
+ "type": "Reroute",
2524
+ "pos": [
2525
+ 1553,
2526
+ 3475
2527
+ ],
2528
+ "size": [
2529
+ 75,
2530
+ 26
2531
+ ],
2532
+ "flags": {},
2533
+ "order": 63,
2534
+ "mode": 0,
2535
+ "inputs": [
2536
+ {
2537
+ "name": "",
2538
+ "type": "*",
2539
+ "link": 76
2540
+ }
2541
+ ],
2542
+ "outputs": [
2543
+ {
2544
+ "name": "VAE",
2545
+ "type": "VAE",
2546
+ "links": [
2547
+ 77
2548
+ ],
2549
+ "slot_index": 0
2550
+ }
+ ],
+ "properties": {
+ "showOutputText": true,
+ "horizontal": false
+ }
+ },
+ {
+ "id": 48,
+ "type": "Text2Image _O",
+ "pos": [
+ 355.59363695640627,
+ 2945.522751555393
+ ],
+ "size": {
+ "0": 400,
+ "1": 436.00006103515625
+ },
+ "flags": {},
+ "order": 75,
+ "mode": 0,
+ "inputs": [
+ {
+ "name": "text",
+ "type": "STRING",
+ "link": 44,
+ "widget": {
+ "name": "text",
+ "config": [
+ "STRING",
+ {
+ "multiline": true
+ }
+ ]
+ }
+ }
+ ],
+ "outputs": [
+ {
+ "name": "IMAGE",
+ "type": "IMAGE",
+ "links": [
+ 45
+ ],
+ "slot_index": 0
+ }
+ ],
+ "properties": {
+ "Node name for S&R": "Text2Image _O"
+ },
+ "widgets_values": [
+ "",
+ "CALIBRI.TTF",
+ 36,
+ 0,
+ 0,
+ 0,
+ 255,
+ 255,
+ 255,
+ 255,
+ 255,
+ 512,
+ 256,
+ "true",
+ 256,
+ 128
+ ],
+ "color": "#232",
+ "bgcolor": "#353"
+ },
+ {
+ "id": 34,
+ "type": "RandomNSP _O",
+ "pos": [
+ 61.593636956406215,
+ 2429.522751555393
+ ],
+ "size": {
+ "0": 315,
+ "1": 106
+ },
+ "flags": {},
+ "order": 33,
+ "mode": 0,
+ "outputs": [
+ {
+ "name": "STRING",
+ "type": "STRING",
+ "links": [
+ 35
+ ],
+ "slot_index": 0
+ }
+ ],
+ "title": "RandomNSP _O (Creature)",
+ "properties": {
+ "Node name for S&R": "RandomNSP _O"
+ },
+ "widgets_values": [
+ "fantasy-creature",
+ 864738385711296,
+ false
+ ],
+ "color": "#232",
+ "bgcolor": "#353"
+ },
+ {
+ "id": 37,
+ "type": "RandomNSP _O",
+ "pos": [
+ 61.602386956406235,
+ 2639.859001555393
+ ],
+ "size": {
+ "0": 315,
+ "1": 106
+ },
+ "flags": {},
+ "order": 34,
+ "mode": 0,
+ "outputs": [
+ {
+ "name": "STRING",
+ "type": "STRING",
+ "links": [
+ 37
+ ],
+ "slot_index": 0
+ }
+ ],
+ "title": "RandomNSP _O (Location)",
+ "properties": {
+ "Node name for S&R": "RandomNSP _O"
+ },
+ "widgets_values": [
+ "pop-location",
+ 837829450938436,
+ false
+ ],
+ "color": "#232",
+ "bgcolor": "#353"
+ }
+ ],
+ "links": [
+ [
+ 2,
+ 3,
+ 0,
+ 4,
+ 0,
+ "STRING"
+ ],
+ [
+ 4,
+ 5,
+ 0,
+ 6,
+ 0,
+ "FLOAT"
+ ],
+ [
+ 5,
+ 6,
+ 0,
+ 7,
+ 0,
+ "INT"
+ ],
+ [
+ 6,
+ 7,
+ 0,
+ 3,
+ 0,
+ "FLOAT"
+ ],
+ [
+ 7,
+ 2,
+ 0,
+ 5,
+ 0,
+ "FLOAT"
+ ],
+ [
+ 8,
+ 8,
+ 0,
+ 12,
+ 0,
+ "STRING"
+ ],
+ [
+ 9,
+ 14,
+ 0,
+ 16,
+ 0,
+ "OPENAI_CHAT_MESSAGES"
+ ],
+ [
+ 10,
+ 15,
+ 0,
+ 16,
+ 1,
+ "OPENAI_CHAT_MESSAGES"
+ ],
+ [
+ 12,
+ 16,
+ 0,
+ 19,
+ 0,
+ "*"
+ ],
+ [
+ 14,
+ 19,
+ 0,
+ 20,
+ 0,
+ "*"
+ ],
+ [
+ 15,
+ 20,
+ 0,
+ 17,
+ 1,
+ "OPENAI_CHAT_MESSAGES"
+ ],
+ [
+ 16,
+ 17,
+ 0,
+ 21,
+ 0,
+ "STRING"
+ ],
+ [
+ 18,
+ 13,
+ 0,
+ 24,
+ 0,
+ "*"
+ ],
+ [
+ 19,
+ 24,
+ 0,
+ 17,
+ 0,
+ "OPENAI"
+ ],
+ [
+ 22,
+ 27,
+ 0,
+ 26,
+ 0,
+ "OPENAI"
+ ],
+ [
+ 23,
+ 24,
+ 0,
+ 28,
+ 0,
+ "*"
+ ],
+ [
+ 24,
+ 28,
+ 0,
+ 27,
+ 0,
+ "*"
+ ],
+ [
+ 25,
+ 26,
+ 0,
+ 30,
+ 0,
+ "IMAGE"
+ ],
+ [
+ 26,
+ 26,
+ 0,
+ 31,
+ 1,
+ "IMAGE"
+ ],
+ [
+ 28,
+ 28,
+ 0,
+ 32,
+ 0,
+ "*"
+ ],
+ [
+ 29,
+ 32,
+ 0,
+ 31,
+ 0,
+ "OPENAI"
+ ],
+ [
+ 30,
+ 31,
+ 0,
+ 33,
+ 0,
+ "IMAGE"
+ ],
+ [
+ 35,
+ 34,
+ 0,
+ 41,
+ 0,
+ "STRING"
+ ],
+ [
+ 36,
+ 41,
+ 0,
+ 39,
+ 0,
+ "STRING"
+ ],
+ [
+ 37,
+ 37,
+ 0,
+ 39,
+ 1,
+ "STRING"
+ ],
+ [
+ 38,
+ 39,
+ 0,
+ 44,
+ 0,
+ "STRING"
+ ],
+ [
+ 39,
+ 44,
+ 0,
+ 45,
+ 0,
+ "STRING"
+ ],
+ [
+ 41,
+ 44,
+ 0,
+ 49,
+ 0,
+ "*"
+ ],
+ [
+ 44,
+ 50,
+ 0,
+ 48,
+ 0,
+ "STRING"
+ ],
+ [
+ 45,
+ 48,
+ 0,
+ 51,
+ 0,
+ "IMAGE"
+ ],
+ [
+ 46,
+ 49,
+ 0,
+ 50,
+ 0,
+ "*"
+ ],
+ [
+ 47,
+ 69,
+ 0,
+ 68,
+ 0,
+ "MODEL"
+ ],
+ [
+ 48,
+ 70,
+ 0,
+ 68,
+ 1,
+ "CONDITIONING"
+ ],
+ [
+ 49,
+ 71,
+ 0,
+ 68,
+ 2,
+ "CONDITIONING"
+ ],
+ [
+ 50,
+ 72,
+ 0,
+ 68,
+ 3,
+ "LATENT"
+ ],
+ [
+ 51,
+ 68,
+ 0,
+ 73,
+ 0,
+ "LATENT"
+ ],
+ [
+ 53,
+ 69,
+ 1,
+ 70,
+ 0,
+ "CLIP"
+ ],
+ [
+ 54,
+ 69,
+ 1,
+ 71,
+ 0,
+ "CLIP"
+ ],
+ [
+ 55,
+ 73,
+ 0,
+ 74,
+ 0,
+ "IMAGE"
+ ],
+ [
+ 56,
+ 68,
+ 0,
+ 66,
+ 0,
+ "LATENT"
+ ],
+ [
+ 57,
+ 66,
+ 0,
+ 75,
+ 0,
+ "LATENT"
+ ],
+ [
+ 58,
+ 75,
+ 0,
+ 76,
+ 0,
+ "IMAGE"
+ ],
+ [
+ 59,
+ 69,
+ 2,
+ 77,
+ 0,
+ "*"
+ ],
+ [
+ 62,
+ 77,
+ 0,
+ 78,
+ 0,
+ "*"
+ ],
+ [
+ 65,
+ 66,
+ 0,
+ 62,
+ 0,
+ "LATENT"
+ ],
+ [
+ 66,
+ 62,
+ 0,
+ 80,
+ 0,
+ "LATENT"
+ ],
+ [
+ 68,
+ 80,
+ 0,
+ 81,
+ 0,
+ "IMAGE"
+ ],
+ [
+ 69,
+ 78,
+ 0,
+ 73,
+ 1,
+ "VAE"
+ ],
+ [
+ 75,
+ 83,
+ 0,
+ 75,
+ 1,
+ "VAE"
+ ],
+ [
+ 76,
+ 83,
+ 0,
+ 84,
+ 0,
+ "*"
+ ],
+ [
+ 77,
+ 84,
+ 0,
+ 80,
+ 1,
+ "VAE"
+ ],
+ [
+ 78,
+ 78,
+ 0,
+ 83,
+ 0,
+ "*"
+ ]
+ ],
+ "groups": [
+ {
+ "title": "Numbers",
+ "bounding": [
+ 0,
+ 0,
+ 1129,
+ 609
+ ],
+ "color": "#8A8"
+ },
+ {
+ "title": "OpenAI",
+ "bounding": [
+ 3,
+ 638,
+ 2030,
+ 1387
+ ],
+ "color": "#3f789e"
+ },
+ {
+ "title": "ChatGPT simple",
+ "bounding": [
+ 462,
+ 690,
+ 1560,
+ 335
+ ],
+ "color": "#88A"
+ },
+ {
+ "title": "ChatGPT",
+ "bounding": [
+ 470,
+ 1196,
+ 1552,
+ 416
+ ],
+ "color": "#88A"
+ },
+ {
+ "title": "Dalle2",
+ "bounding": [
+ 473,
+ 1634,
+ 1550,
+ 372
+ ],
+ "color": "#88A"
+ },
+ {
+ "title": "Text tools",
+ "bounding": [
+ 5,
+ 2042,
+ 2029,
+ 1359
+ ],
+ "color": "#3f789e"
+ },
+ {
+ "title": "Soup Prompts",
+ "bounding": [
+ 32,
+ 2345,
+ 403,
+ 422
+ ],
+ "color": "#8A8"
+ },
+ {
+ "title": "text operations",
+ "bounding": [
+ 469,
+ 2104,
+ 1543,
+ 665
+ ],
+ "color": "#88A"
+ },
+ {
+ "title": "Utility",
+ "bounding": [
+ -4,
+ 4007,
+ 2027,
+ 606
+ ],
+ "color": "#3f789e"
+ },
+ {
+ "title": "Latent tools",
+ "bounding": [
+ 7,
+ 3416,
+ 2024,
+ 579
+ ],
+ "color": "#3f789e"
+ },
+ {
+ "title": "Image tools",
+ "bounding": [
+ 1157,
+ -1,
+ 872,
+ 609
+ ],
+ "color": "#3f789e"
+ },
+ {
+ "title": "generate 4 images",
+ "bounding": [
+ 6,
+ 3451,
+ 510,
+ 543
+ ],
+ "color": "#88A"
+ },
+ {
+ "title": "Group",
+ "bounding": [
+ 73,
+ 3485,
+ 140,
+ 80
+ ],
+ "color": "#3f789e"
+ },
+ {
+ "title": "Group",
+ "bounding": [
+ 53,
+ 3487,
+ 140,
+ 80
+ ],
+ "color": "#3f789e"
+ }
+ ],
+ "config": {},
+ "extra": {},
+ "version": 0.4
+ }