Proper Doc

README.md CHANGED

@@ -1,7 +1,6 @@
 ---
 license: mit
 ---
-
 # Table of Contents
 
 * [ChatAtomicFlow](#ChatAtomicFlow)
@@ -9,6 +8,7 @@ license: mit
 * [set\_up\_flow\_state](#ChatAtomicFlow.ChatAtomicFlow.set_up_flow_state)
 * [instantiate\_from\_config](#ChatAtomicFlow.ChatAtomicFlow.instantiate_from_config)
 * [get\_interface\_description](#ChatAtomicFlow.ChatAtomicFlow.get_interface_description)
+* [query\_llm](#ChatAtomicFlow.ChatAtomicFlow.query_llm)
 * [run](#ChatAtomicFlow.ChatAtomicFlow.run)
 
 <a id="ChatAtomicFlow"></a>
@@ -136,15 +136,15 @@ This method returns the description of the flow's input and output interface.
 
 `Dict[str, Any]`: The description of the flow's interface.
 
-<a id="ChatAtomicFlow.ChatAtomicFlow.
+<a id="ChatAtomicFlow.ChatAtomicFlow.query_llm"></a>
 
-####
+#### query\_llm
 
 ```python
-def
+def query_llm(input_data: Dict[str, Any])
 ```
 
-This method
+This method queries the LLM. It processes the input, calls the backend and returns the response.
 
 **Arguments**:
 
@@ -152,5 +152,19 @@ This method runs the flow. It processes the input, calls the backend and updates
 
 **Returns**:
 
-`
+`Union[str, List[str]]`: The response of the LLM to the input.
+
+<a id="ChatAtomicFlow.ChatAtomicFlow.run"></a>
+
+#### run
+
+```python
+def run(input_message: FlowMessage)
+```
+
+This method runs the flow. It processes the input, calls the backend and updates the state of the flow.
+
+**Arguments**:
+
+- `input_message` (`aiflows.messages.FlowMessage`): The input data of the flow.
 
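For orientation, the two documented methods follow a common pattern: `query_llm` is a stateless input-to-response call, while `run` wraps it and updates the flow's state. The sketch below illustrates that split only; `MiniChatFlow`, the `FlowMessage` dataclass, and the echo "backend" are hypothetical stand-ins, not the actual aiflows API.

```python
from dataclasses import dataclass
from typing import Any, Dict, List, Union


@dataclass
class FlowMessage:
    """Hypothetical stand-in for aiflows.messages.FlowMessage."""
    data: Dict[str, Any]


class MiniChatFlow:
    """Toy illustration of the query_llm / run split described above."""

    def __init__(self) -> None:
        # Minimal flow state: just a history of responses.
        self.flow_state: Dict[str, Any] = {"history": []}

    def query_llm(self, input_data: Dict[str, Any]) -> Union[str, List[str]]:
        # Process the input and call the backend; here the "backend"
        # is a stub that echoes the prompt back.
        prompt = input_data["prompt"]
        return f"echo: {prompt}"

    def run(self, input_message: FlowMessage) -> None:
        # Process the input, call the backend via query_llm,
        # and update the state of the flow.
        response = self.query_llm(input_message.data)
        self.flow_state["history"].append(response)


flow = MiniChatFlow()
flow.run(FlowMessage(data={"prompt": "hi"}))
print(flow.flow_state["history"])  # ['echo: hi']
```

Keeping the backend call (`query_llm`) separate from the state update (`run`) makes the LLM call easy to test or swap in isolation, which matches the interface this commit documents.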