mrfirdauss committed
Commit 58ea756 · 1 Parent(s): 620ca98

fix: save as plot, persist

Files changed (3)
  1. README.md +6 -0
  2. pdf_question.png +0 -0
  3. src/FinancialAgentApp.py +14 -5
README.md CHANGED
@@ -12,6 +12,9 @@ short_description: openchat for financial creditcard fraud
 ---
 
 # Financial RAG
+![Data Frame Question](./data_plot.png)
+
+![PDF Paper Question](./pdf_question.png)
 
 ## Model Selection:
 For this task I selected a **decoder-only** model, because we want a conversational model rather than one tuned for a single task. To reduce hallucination, we add RAG context.
@@ -30,6 +33,9 @@ I choose `sentence-transformers/all-MiniLM-L6-v2` embedding model because it hav
 ## Ollama API
 As an alternative, we provide a self-hosted Ollama API on Hugging Face. This API already has the "qwen3-4b" model enabled. [link](https://mrfirdauss-ollama-api.hf.space)
 
+## Design Pattern
+I apply the **`Factory Design Pattern`**, because we want every library we use to expose the same methods, even when the libraries' original methods differ. We therefore create an abstract factory base class.
+
 ## Schema
 ```
 User Input
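The `Factory Design Pattern` note added to the README above could be sketched roughly as follows. This is a minimal illustration, not the app's actual code: `LLMClient`, `OllamaClient`, `OpenAIClient`, `chat`, and `make_client` are all hypothetical names, assuming the goal is a single shared interface over backends whose native APIs differ:

```python
from abc import ABC, abstractmethod


class LLMClient(ABC):
    """Abstract base: every backend must expose the same chat() method."""

    @abstractmethod
    def chat(self, prompt: str) -> str:
        ...


class OllamaClient(LLMClient):
    def chat(self, prompt: str) -> str:
        # A real implementation would call the self-hosted Ollama API here.
        return f"ollama: {prompt}"


class OpenAIClient(LLMClient):
    def chat(self, prompt: str) -> str:
        # A real implementation would call an OpenAI-compatible endpoint here.
        return f"openai: {prompt}"


def make_client(backend: str) -> LLMClient:
    """Factory: callers pick a backend by name instead of writing library-specific code."""
    clients = {"ollama": OllamaClient, "openai": OpenAIClient}
    return clients[backend]()
```

With this shape, swapping the model backend is a one-string change at the call site (`make_client("ollama").chat(...)`), which is the property the README's paragraph is after.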
pdf_question.png ADDED
src/FinancialAgentApp.py CHANGED
@@ -23,13 +23,19 @@ class FinancialAgentApp (ABC):
     def render_header(self):
         self.st.title("Financial Agent")
 
+
     def render_messages(self):
-        """Render previous chat messages."""
+        """Render previous chat messages with roles."""
         for message in self.st.session_state.messages:
+            role = message.get("role", "assistant")  # default to assistant if missing
+
             if message.get("type") == "plot":
-                self.st.image(message["content"])
+                with self.st.chat_message(role):
+                    self.st.pyplot(message["content"])
             else:
-                self.st.markdown(message["content"])
+                with self.st.chat_message(role):
+                    self.st.markdown(message["content"])
+
 
     @abstractmethod
     def __stream_answer__(self, instructions, input_messages):
@@ -82,12 +88,15 @@ class FinancialAgentApp (ABC):
         if fig.get_axes():  # if a chart was generated
             with self.st.chat_message("assistant"):
                 self.st.pyplot(fig)
-                plt.close(fig)
+        buf = self.__safe_savefig__()  # BytesIO PNG
+        # Add the plot as a chat message in session state
         self.st.session_state.messages.append({
             "role": "assistant",
             "type": "plot",
-            "content": self.__safe_savefig__()
+            "content": buf
         })
+        plt.close(fig)
+
 
         context_prompt = "## CONTEXT DATAFRAME.\n"
         context_prompt += str(local_scope.get("result", ""))
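The persistence fix in the diff above stores the rendered figure as a detached in-memory PNG buffer rather than the live figure object, so the plot survives `plt.close(fig)` and can be re-rendered on later Streamlit reruns. A minimal standard-library sketch of that idea (no Streamlit or matplotlib here; `save_plot_message` is a hypothetical stand-in for the `__safe_savefig__` + `session_state.messages.append` pair):

```python
import io


def save_plot_message(messages: list, png_bytes: bytes) -> None:
    """Persist a rendered plot as a detached in-memory PNG buffer.

    Storing bytes instead of the live figure means plt.close(fig)
    can run safely afterwards, and the chat history can redraw the
    image on every rerun without keeping the figure alive.
    """
    buf = io.BytesIO(png_bytes)  # detached copy, independent of the figure
    buf.seek(0)                  # rewind so the next render reads from the start
    messages.append({"role": "assistant", "type": "plot", "content": buf})
```

This also motivates the ordering in the diff: the buffer is captured first, appended to the message history, and only then is the figure closed.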