Commit 8c0ec2b · Parent(s): 534dece

fix(P0): Implement proper AIFunction serialization for HuggingFace

Root cause: AIFunction objects from Microsoft agent-framework were
passed directly to HuggingFace InferenceClient, causing JSON
serialization errors.

Fix:
- Add _convert_tools() to convert AIFunction → OpenAI-compatible JSON
- Add _parse_tool_calls() to convert HF responses → FunctionCallContent
- Update both the sync and streaming response methods

Verified:
- 307 tests pass (make check)
- Tool serialization: 3 AIFunction objects → 3 JSON dicts (2128 bytes)
- End-to-end research completes successfully

Closes P0 AIFunction serialization bug for Free Tier.
docs/bugs/P0_AIFUNCTION_NOT_JSON_SERIALIZABLE.md
CHANGED
@@ -1,7 +1,7 @@
# P0 Bug: AIFunction Not JSON Serializable (Free Tier Broken)

**Severity**: P0 (Critical) - Free Tier cannot perform research
-**Status**:
**Discovered**: 2025-12-01
**Reporter**: Production user via HuggingFace Spaces

@@ -47,14 +47,6 @@ TypeError: Object of type AIFunction is not JSON serializable
4. `requests.post()` internally calls `json.dumps()` on the request body
5. `AIFunction` has no `__json__()` method or isn't a dict → TypeError

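Steps 4-5 above can be reproduced in isolation: `json.dumps` raises exactly this error for any object that is neither a built-in JSON type nor handled by a custom encoder. A minimal sketch, using a stand-in class rather than the real `AIFunction`:

```python
import json

class AIFunctionStandIn:
    """Stand-in for an arbitrary object with no JSON representation."""
    def __init__(self, name: str) -> None:
        self.name = name

try:
    # Same shape as the request body: a payload containing non-dict tool objects
    json.dumps({"tools": [AIFunctionStandIn("search_pubmed")]})
except TypeError as e:
    print(e)  # Object of type AIFunctionStandIn is not JSON serializable
```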
-### The Warning We Ignored
-
-The agent framework already warned us:
-
-```
-[WARNING] The provided chat client does not support function invoking,
-this might limit agent capabilities.
-```
-
## Impact

| Component | Impact |
@@ -63,104 +55,107 @@ this might limit agent capabilities.
| Advanced Mode without API key | **Cannot do research** |
| Paid Tier (OpenAI) | Unaffected (OpenAI handles AIFunction) |

-##
-
-### Option 1: Disable Tools for HuggingFace (QUICK FIX)
-
-Pass `tools=None` to disable function calling entirely:
-
-```python
-# src/clients/huggingface.py
-
-async def _inner_get_response(self, ...):
-    hf_messages = self._convert_messages(messages)

-
-    # The agents will use natural language instructions instead
-    tools = None  # Was: chat_options.tools if chat_options.tools else None
-    hf_tool_choice = None
-    ...
-```

-**
-
-- No serialization errors
-- Agents still work via natural language instructions

-
-- Agents can't use structured tool calls
-- Less precise than function calling
-- Qwen2.5-72B DOES support function calling (we're not using it)
-
-### Option 2: Convert AIFunction to JSON Schema (PROPER FIX)
-
-Serialize `AIFunction` objects to OpenAI-compatible tool format:

```python
def _convert_tools(self, tools: list[Any] | None) -> list[dict[str, Any]] | None:
-    """Convert AIFunction objects to
    if not tools:
        return None

    json_tools = []
    for tool in tools:
        if hasattr(tool, 'to_dict'):
-
-            json_tools.append(tool.to_dict())
-        elif hasattr(tool, 'schema'):
-            # Alternative: use schema property
            json_tools.append({
                "type": "function",
                "function": {
-                    "name":
-                    "description":
-                    "parameters":
                }
            })
        else:
-            # Fallback: skip unknown tool types
            logger.warning(f"Skipping non-serializable tool: {type(tool)}")

    return json_tools if json_tools else None
```

-
-- Proper function calling with Qwen2.5
-- Structured tool invocation
-- Better agent capabilities
-
-**Cons**:
-- More complex
-- Need to handle tool call responses
-- May require testing with different HF models

-
-
-Try to convert tools, fall back to None if it fails:

```python
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
```

-

-
-

## Call Stack Trace

@@ -197,18 +192,17 @@ from src.orchestrators.advanced import AdvancedOrchestrator
async def test():
    orch = AdvancedOrchestrator(max_rounds=2)
    async for event in orch.run('testosterone benefits'):
-        print(f'[{event.type}] {event.message[:50]}...')

asyncio.run(test())
"

-# Expected: TypeError: Object of type AIFunction is not JSON serializable
-#
```

## References

-- [
-- [HuggingFace Chat Completion API](https://huggingface.co/docs/api-inference/en/tasks/chat-completion)
- [Qwen Function Calling](https://qwen.readthedocs.io/en/latest/framework/function_call.html)
-- [

# P0 Bug: AIFunction Not JSON Serializable (Free Tier Broken)

**Severity**: P0 (Critical) - Free Tier cannot perform research
+**Status**: In Progress
**Discovered**: 2025-12-01
**Reporter**: Production user via HuggingFace Spaces

4. `requests.post()` internally calls `json.dumps()` on the request body
5. `AIFunction` has no `__json__()` method or isn't a dict → TypeError

## Impact

| Component | Impact |
| Advanced Mode without API key | **Cannot do research** |
| Paid Tier (OpenAI) | Unaffected (OpenAI handles AIFunction) |

+## Professional Fix (Full Implementation)

+Qwen2.5-72B-Instruct **SUPPORTS** function calling via HuggingFace. The fix requires:

+1. **Request Serialization**: Convert `AIFunction` → OpenAI-compatible JSON
+2. **Response Parsing**: Convert HuggingFace `tool_calls` → Framework `FunctionCallContent`

+### Part 1: Tool Serialization (`_convert_tools`)

```python
def _convert_tools(self, tools: list[Any] | None) -> list[dict[str, Any]] | None:
+    """Convert AIFunction objects to OpenAI-compatible tool definitions.
+
+    AIFunction.to_dict() returns:
+        {'type': 'ai_function', 'name': '...', 'description': '...', 'input_model': {...}}
+
+    OpenAI/HuggingFace expects:
+        {'type': 'function', 'function': {'name': '...', 'description': '...', 'parameters': {...}}}
+    """
    if not tools:
        return None

    json_tools = []
    for tool in tools:
        if hasattr(tool, 'to_dict'):
+            t_dict = tool.to_dict()
            json_tools.append({
                "type": "function",
                "function": {
+                    "name": t_dict["name"],
+                    "description": t_dict.get("description", ""),
+                    "parameters": t_dict["input_model"]
                }
            })
+        elif isinstance(tool, dict):
+            json_tools.append(tool)
        else:
            logger.warning(f"Skipping non-serializable tool: {type(tool)}")

    return json_tools if json_tools else None
```

+### Part 2: Response Parsing (Tool Calls → FunctionCallContent)

+When HuggingFace returns tool calls, we must convert them to the framework's format:

```python
+from agent_framework._types import FunctionCallContent
+
+# In _inner_get_response, after getting the response:
+choice = choices[0]
+message = choice.message
+message_content = message.content or ""
+
+# Parse tool calls if present
+contents: list[Any] = []
+if hasattr(message, 'tool_calls') and message.tool_calls:
+    for tc in message.tool_calls:
+        # HF returns: tc.id, tc.function.name, tc.function.arguments
+        contents.append(FunctionCallContent(
+            call_id=tc.id,
+            name=tc.function.name,
+            arguments=tc.function.arguments  # JSON string or dict
+        ))
+
+response_msg = ChatMessage(
+    role=cast(Any, message.role),
+    text=message_content,
+    contents=contents if contents else None
+)
```
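The parsing loop above can be exercised without calling the API by substituting simple stand-ins for the HuggingFace response objects (the `SimpleNamespace` mocks below are illustrative, not the real `huggingface_hub` types):

```python
from types import SimpleNamespace

# Stand-ins mimicking the HF response shape: tc.id, tc.function.name, tc.function.arguments
tool_call = SimpleNamespace(
    id="call_0",
    function=SimpleNamespace(name="search_pubmed", arguments='{"query": "testosterone"}'),
)
message = SimpleNamespace(content=None, tool_calls=[tool_call])

# Same field extraction as the parsing loop above, collected into plain dicts
parsed = [
    {"call_id": tc.id, "name": tc.function.name, "arguments": tc.function.arguments}
    for tc in (message.tool_calls or [])
]
print(parsed[0]["name"])  # search_pubmed
```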

+### Verified Schema Mapping

+```python
+# AIFunction.to_dict() output (verified 2025-12-01):
+{
+    "type": "ai_function",
+    "name": "search_pubmed",
+    "description": "Search PubMed for biomedical research papers...",
+    "input_model": {
+        "properties": {"query": {"title": "Query", "type": "string"}, ...},
+        "required": ["query"],
+        "type": "object"
+    }
+}
+
+# Mapped to OpenAI format:
+{
+    "type": "function",
+    "function": {
+        "name": "search_pubmed",
+        "description": "Search PubMed for biomedical research papers...",
+        "parameters": {
+            "properties": {"query": {"title": "Query", "type": "string"}, ...},
+            "required": ["query"],
+            "type": "object"
+        }
+    }
+}
+```
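The mapping is mechanical enough to check with plain dicts. A sketch, restricted to the fields shown in the verified schema (the real `AIFunction.to_dict()` output may carry additional properties):

```python
# AIFunction.to_dict()-shaped input, as documented above
ai_function_dict = {
    "type": "ai_function",
    "name": "search_pubmed",
    "description": "Search PubMed for biomedical research papers...",
    "input_model": {
        "properties": {"query": {"title": "Query", "type": "string"}},
        "required": ["query"],
        "type": "object",
    },
}

# The three-field rename: name stays, description defaults, input_model -> parameters
openai_tool = {
    "type": "function",
    "function": {
        "name": ai_function_dict["name"],
        "description": ai_function_dict.get("description", ""),
        "parameters": ai_function_dict["input_model"],
    },
}

assert openai_tool["function"]["parameters"]["required"] == ["query"]
```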

## Call Stack Trace

async def test():
    orch = AdvancedOrchestrator(max_rounds=2)
    async for event in orch.run('testosterone benefits'):
+        print(f'[{event.type}] {str(event.message)[:50]}...')

asyncio.run(test())
"

+# Expected BEFORE fix: TypeError: Object of type AIFunction is not JSON serializable
+# Expected AFTER fix: Research completes with tool calls working
```

## References

+- [HuggingFace Chat Completion - Function Calling](https://huggingface.co/docs/inference-providers/tasks/chat-completion)
- [Qwen Function Calling](https://qwen.readthedocs.io/en/latest/framework/function_call.html)
+- [Microsoft Agent Framework - AIFunction](https://learn.microsoft.com/en-us/python/api/agent-framework-core/agent_framework.aifunction)
src/clients/huggingface.py
CHANGED
@@ -18,6 +18,7 @@ from agent_framework import (
    ChatResponse,
    ChatResponseUpdate,
)
from huggingface_hub import InferenceClient

from src.utils.config import settings

@@ -26,7 +27,7 @@ logger = structlog.get_logger()


class HuggingFaceChatClient(BaseChatClient):  # type: ignore[misc]
-    """Adapter for HuggingFace Inference API."""

    def __init__(
        self,

@@ -69,6 +70,69 @@ class HuggingFaceChatClient(BaseChatClient):  # type: ignore[misc]
        hf_messages.append({"role": role_str, "content": content})
        return hf_messages

    async def _inner_get_response(
        self,
        *,

@@ -79,12 +143,13 @@ class HuggingFaceChatClient(BaseChatClient):  # type: ignore[misc]
        """Synchronous response generation using chat_completion."""
        hf_messages = self._convert_messages(messages)

-        #
-        tools = chat_options.tools if chat_options.tools else None
        # HF expects 'tool_choice' to be 'auto', 'none', or specific tool
        # Framework uses ToolMode enum or dict
        hf_tool_choice: str | None = None
-        if chat_options.tool_choice is not None:
            tool_choice_str = str(chat_options.tool_choice)
            if "AUTO" in tool_choice_str:
                hf_tool_choice = "auto"

@@ -116,12 +181,17 @@ class HuggingFaceChatClient(BaseChatClient):  # type: ignore[misc]
            return ChatResponse(messages=[], response_id="error-no-choices")

        choice = choices[0]
-

-        #
        response_msg = ChatMessage(
-            role=cast(Any,
            text=message_content,
        )

        return ChatResponse(

@@ -143,9 +213,11 @@ class HuggingFaceChatClient(BaseChatClient):  # type: ignore[misc]
        """Streaming response generation."""
        hf_messages = self._convert_messages(messages)

-
        hf_tool_choice: str | None = None
-        if chat_options.tool_choice is not None:
            if "AUTO" in str(chat_options.tool_choice):
                hf_tool_choice = "auto"


    ChatResponse,
    ChatResponseUpdate,
)
+from agent_framework._types import FunctionCallContent
from huggingface_hub import InferenceClient

from src.utils.config import settings


class HuggingFaceChatClient(BaseChatClient):  # type: ignore[misc]
+    """Adapter for HuggingFace Inference API with full function calling support."""

    def __init__(
        self,

        hf_messages.append({"role": role_str, "content": content})
        return hf_messages

+    def _convert_tools(self, tools: list[Any] | None) -> list[dict[str, Any]] | None:
+        """Convert AIFunction objects to OpenAI-compatible tool definitions.
+
+        AIFunction.to_dict() returns:
+            {'type': 'ai_function', 'name': '...', 'input_model': {...}}
+
+        OpenAI/HuggingFace expects:
+            {'type': 'function', 'function': {'name': '...', 'parameters': {...}}}
+        """
+        if not tools:
+            return None
+
+        json_tools = []
+        for tool in tools:
+            if hasattr(tool, "to_dict"):
+                try:
+                    t_dict = tool.to_dict()
+                    json_tools.append(
+                        {
+                            "type": "function",
+                            "function": {
+                                "name": t_dict["name"],
+                                "description": t_dict.get("description", ""),
+                                "parameters": t_dict["input_model"],
+                            },
+                        }
+                    )
+                except (KeyError, TypeError) as e:
+                    logger.warning("Failed to convert tool", tool=str(tool), error=str(e))
+            elif isinstance(tool, dict):
+                # Already a dict - assume correct format
+                json_tools.append(tool)
+            else:
+                logger.warning("Skipping non-serializable tool", tool_type=str(type(tool)))
+
+        return json_tools if json_tools else None
+
+    def _parse_tool_calls(self, message: Any) -> list[FunctionCallContent]:
+        """Parse HuggingFace tool_calls into framework FunctionCallContent.
+
+        HF returns tool_calls as:
+            [ChatCompletionOutputToolCall(id='...', function=ChatCompletionOutputFunctionDefinition(
+                name='...', arguments='{"key": "value"}'), type='function')]
+        """
+        contents: list[FunctionCallContent] = []
+
+        if not hasattr(message, "tool_calls") or not message.tool_calls:
+            return contents
+
+        for tc in message.tool_calls:
+            try:
+                contents.append(
+                    FunctionCallContent(
+                        call_id=tc.id,
+                        name=tc.function.name,
+                        arguments=tc.function.arguments,  # JSON string or dict
+                    )
+                )
+            except (AttributeError, TypeError) as e:
+                logger.warning("Failed to parse tool call", error=str(e))
+
+        return contents
+
    async def _inner_get_response(
        self,
        *,

        """Synchronous response generation using chat_completion."""
        hf_messages = self._convert_messages(messages)

+        # Convert AIFunction objects to OpenAI-compatible JSON
+        tools = self._convert_tools(chat_options.tools if chat_options.tools else None)
+
        # HF expects 'tool_choice' to be 'auto', 'none', or specific tool
        # Framework uses ToolMode enum or dict
        hf_tool_choice: str | None = None
+        if tools and chat_options.tool_choice is not None:
            tool_choice_str = str(chat_options.tool_choice)
            if "AUTO" in tool_choice_str:
                hf_tool_choice = "auto"

            return ChatResponse(messages=[], response_id="error-no-choices")

        choice = choices[0]
+        message = choice.message
+        message_content = message.content or ""

+        # Parse tool calls if present
+        tool_call_contents = self._parse_tool_calls(message)
+
+        # Construct response message with tool calls in contents
        response_msg = ChatMessage(
+            role=cast(Any, message.role),
            text=message_content,
+            contents=tool_call_contents if tool_call_contents else None,
        )

        return ChatResponse(

        """Streaming response generation."""
        hf_messages = self._convert_messages(messages)

+        # Convert AIFunction objects to OpenAI-compatible JSON
+        tools = self._convert_tools(chat_options.tools if chat_options.tools else None)
+
        hf_tool_choice: str | None = None
+        if tools and chat_options.tool_choice is not None:
            if "AUTO" in str(chat_options.tool_choice):
                hf_tool_choice = "auto"
