naughtondale committed on
Commit
f47e828
·
1 Parent(s): 406390c

Update app.py

Files changed (1)
  1. app.py +0 -178
app.py CHANGED
@@ -54,181 +54,3 @@ chatbot = create_chatbot()
 
  # Run the chatbot
  chatbot.launch()
- This code first creates a Chroma database to store the embeddings, then defines a function that creates embeddings from a text document and saves them to the Chroma database.
-
- The code then creates a chatbot using the Gradio library. The chatbot has a button the user can click to start chatting. When the user clicks the button, the chatbot takes the user's input, creates embeddings from it, finds the most similar document in the Chroma database, and generates a response using the GPT-3.5 Turbo model.
-
- The chatbot then displays the response to the user, who can continue chatting by clicking the button again.
-
- I hope this helps!
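The flow described above (embed the input, look up the closest stored document, then answer) can be sketched end to end with a stand-in embedder; `embed` and `most_similar` here are illustrative helpers only, not part of the Gradio, OpenAI, or Chroma APIs:

```python
# Stand-in embedder: maps text to letter frequencies. A real app would
# call an embedding model; the retrieval logic is the same either way.
def embed(text):
    text = text.lower()
    return [text.count(c) for c in "abcdefghijklmnopqrstuvwxyz"]

def most_similar(query_vec, database):
    # Pick the stored document whose vector is closest (smallest squared distance)
    def dist(item):
        return sum((a - b) ** 2 for a, b in zip(query_vec, item[1]))
    return min(database, key=dist)[0]

docs = ["shipping takes three days", "refunds are processed weekly"]
database = [(d, embed(d)) for d in docs]
print(most_similar(embed("how long is shipping"), database))
```

A real vector store replaces the linear scan with an index, but the retrieve-then-generate shape is the same.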
-
- what about the source document that I want to populate the database and create embeddings from?
-
- The source document is the text that you want to create embeddings from and store in the Chroma database. It can be any document you can extract plain text from, such as a PDF, Word document, or plain text file.
-
- To create embeddings from the source document, you can use the create_embeddings() function in the code above. This function takes the text of the source document and creates a vector representation of it: a list of numbers that represents the meaning of the text.
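As a toy illustration of what "a list of numbers" means (not the actual OpenAI embedding, which comes from the text-embedding-ada-002 model), a bag-of-words count vector is also a vector representation derived from text; `toy_vector` is a made-up helper:

```python
def toy_vector(text, vocabulary):
    # Count how often each vocabulary word appears in the text;
    # the resulting list of numbers is a (very crude) vector representation.
    words = text.lower().split()
    return [words.count(term) for term in vocabulary]

vocab = ["chatbot", "database", "embeddings"]
print(toy_vector("The chatbot stores embeddings in a database", vocab))  # → [1, 1, 1]
```

Model-based embeddings differ in that nearby vectors capture similar *meaning*, not just shared words.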
-
- The vector representation of the text can then be stored in the Chroma database using the save_embeddings() function in the code above.
-
- Once the embeddings have been stored in the Chroma database, you can use them to create the chatbot: it will use the embeddings to find the most similar document in the database and generate a response.
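Under the hood, "most similar" is usually measured with cosine similarity between embedding vectors, which Chroma computes for you; here is a minimal hand-rolled sketch with hypothetical store contents:

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: 1.0 means identical direction
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def find_most_similar(query, store):
    # Return the stored text whose embedding is closest to the query vector
    return max(store, key=lambda item: cosine_similarity(query, item["embedding"]))["text"]

store = [
    {"text": "doc A", "embedding": [1.0, 0.0, 0.0]},
    {"text": "doc B", "embedding": [0.0, 1.0, 1.0]},
]
print(find_most_similar([0.0, 2.0, 1.9], store))  # → doc B
```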
-
- Here are some things to keep in mind when creating the source document:
-
- The source document should be well-written and free of errors.
- The source document should be relevant to the topic of the chatbot.
- The source document should be long enough to generate a variety of embeddings.
- I hope this helps!
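On the last guideline: in practice a long source document is split into overlapping chunks, with each chunk embedded separately, so the database holds a variety of embeddings rather than one vector for the whole file. `chunk_text` below is a hypothetical helper sketching one common approach:

```python
def chunk_text(text, chunk_size=500, overlap=50):
    # Slide a window of chunk_size characters over the text,
    # stepping back `overlap` characters so chunks share context.
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
    return chunks

chunks = chunk_text("x" * 1200, chunk_size=500, overlap=50)
print(len(chunks))  # → 3
```

Real pipelines often split on sentence or paragraph boundaries instead of fixed character counts.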
-
- show me how that source document, which you can call 'source.txt', is loaded into the code first and then the embeddings created from it and saved in the Chroma database
-
- Sure, here is how you can load the source document source.txt, create embeddings from it, and save them in the Chroma database:
-
- Python
- # Load the source document
- with open("source.txt", "r") as f:
-     text = f.read()
-
- # Create embeddings from the source document
- embeddings = create_embeddings(text)
-
- # Save the embeddings to the Chroma database
- save_embeddings(embeddings)
- The with open() statement opens the source document and reads its text into the variable text. The create_embeddings() function then creates embeddings from that text, and save_embeddings() stores them in the Chroma database.
-
- I hope this helps!
-
- please redo the following code you gave earlier with the last suggestion, using 'source.txt':
- import os
- import openai
- import gradio as gr
- import chromadb
-
- # Set the OpenAI API key
- openai.api_key = os.environ["OPENAI_API_KEY"]
-
- # Create a Chroma database with a collection for the documents
- client = chromadb.PersistentClient(path="database")
- collection = client.get_or_create_collection("documents")
-
- # Function to create embeddings from a text document
- def create_embeddings(text):
-     response = openai.Embedding.create(model="text-embedding-ada-002", input=text)
-     return {"text": text, "embedding": response["data"][0]["embedding"]}
-
- # Function to save embeddings to the Chroma database
- def save_embeddings(embeddings):
-     collection.add(
-         ids=[str(collection.count())],
-         documents=[embeddings["text"]],
-         embeddings=[embeddings["embedding"]],
-     )
-
- # Function to create a chatbot
- def create_chatbot():
-     # Function to handle user input
-     def handle_input(input_text):
-         # Get the embeddings for the user input
-         query = create_embeddings(input_text)
-
-         # Find the most similar document in the database
-         results = collection.query(query_embeddings=[query["embedding"]], n_results=1)
-         document = results["documents"][0][0]
-
-         # Generate a response using the GPT-3.5 Turbo model
-         response = openai.ChatCompletion.create(
-             model="gpt-3.5-turbo",
-             messages=[
-                 {"role": "system", "content": "Answer using this context: " + document},
-                 {"role": "user", "content": input_text},
-             ],
-             max_tokens=100,
-         )
-         return response["choices"][0]["message"]["content"]
-
-     # Build a simple Gradio interface with a text box for chatting
-     return gr.Interface(fn=handle_input, inputs="text", outputs="text", title="Chatbot")
-
- # Create the chatbot
- chatbot = create_chatbot()
-
- # Run the chatbot
- chatbot.launch()
-
- Sure, here is the code with the last suggestion:
-
- Python
- import os
- import openai
- import gradio as gr
- import chromadb
-
- # Set the OpenAI API key
- openai.api_key = os.environ["OPENAI_API_KEY"]
-
- # Create a Chroma database with a collection for the documents
- client = chromadb.PersistentClient(path="database")
- collection = client.get_or_create_collection("documents")
-
- # Function to create embeddings from a text document
- def create_embeddings(text):
-     response = openai.Embedding.create(model="text-embedding-ada-002", input=text)
-     return {"text": text, "embedding": response["data"][0]["embedding"]}
-
- # Function to save embeddings to the Chroma database
- def save_embeddings(embeddings):
-     collection.add(
-         ids=[str(collection.count())],
-         documents=[embeddings["text"]],
-         embeddings=[embeddings["embedding"]],
-     )
-
- # Function to load the source document
- def load_source_document():
-     with open("source.txt", "r") as f:
-         text = f.read()
-     return text
-
- # Function to create a chatbot
- def create_chatbot():
-     # Function to handle user input
-     def handle_input(input_text):
-         # Get the embeddings for the user input
-         query = create_embeddings(input_text)
-
-         # Find the most similar document in the database
-         results = collection.query(query_embeddings=[query["embedding"]], n_results=1)
-         document = results["documents"][0][0]
-
-         # Generate a response using the GPT-3.5 Turbo model
-         response = openai.ChatCompletion.create(
-             model="gpt-3.5-turbo",
-             messages=[
-                 {"role": "system", "content": "Answer using this context: " + document},
-                 {"role": "user", "content": input_text},
-             ],
-             max_tokens=100,
-         )
-         return response["choices"][0]["message"]["content"]
-
-     # Load the source document
-     text = load_source_document()
-
-     # Create embeddings from the source document and save them in the Chroma database
-     save_embeddings(create_embeddings(text))
-
-     # Build a simple Gradio interface with a text box for chatting
-     return gr.Interface(fn=handle_input, inputs="text", outputs="text", title="Chatbot")
-
- # Create the chatbot
- chatbot = create_chatbot()
-
- # Run the chatbot
- chatbot.launch()
 