from __future__ import annotations

import json
import logging
import os
from pathlib import Path
from datetime import datetime, timezone
from typing import Any, Dict, Optional, Tuple

from dotenv import load_dotenv


def load_environment() -> None:
    """Load environment variables from a local .env if present."""

    # Prefer the project-local .env at ai_business_automation_agent/.env.
    # This avoids surprises when Streamlit's working directory differs.
    project_env = Path(__file__).resolve().parent / ".env"
    # Use override=True to ensure .env values replace empty process env vars.
    if project_env.exists():
        load_dotenv(dotenv_path=project_env, override=True)
    else:
        load_dotenv(override=True)


def setup_logging() -> None:
    """Configure root logging from the LOG_LEVEL env var (defaults to INFO)."""
    level = os.getenv("LOG_LEVEL", "INFO").upper().strip()
    if level not in {"CRITICAL", "ERROR", "WARNING", "INFO", "DEBUG"}:
        level = "INFO"  # Fall back rather than crash on an unrecognized value.
    logging.basicConfig(
        level=level,
        format="%(asctime)s | %(levelname)s | %(name)s | %(message)s",
    )


def utc_now_iso() -> str:
    """Return the current UTC time as an ISO-8601 string."""
    return datetime.now(timezone.utc).isoformat()


def _extract_first_json_object(text: str) -> Optional[str]:
    """Best-effort extraction of first top-level JSON object from text."""

    start = text.find("{")
    if start == -1:
        return None
    depth = 0
    for i in range(start, len(text)):
        ch = text[i]
        if ch == "{":
            depth += 1
        elif ch == "}":
            depth -= 1
            if depth == 0:
                return text[start : i + 1]
    return None


def parse_llm_json(text: str) -> Tuple[Dict[str, Any], Optional[str]]:
    """

    Parse strict JSON from an LLM response.



    Returns (obj, error). If parsing fails, obj will be {}, error will be a message.

    """

    raw = text.strip()
    try:
        return json.loads(raw), None
    except Exception:
        candidate = _extract_first_json_object(raw)
        if not candidate:
            return {}, "No JSON object found in model output."
        try:
            return json.loads(candidate), None
        except Exception as e:
            return {}, f"Failed to parse JSON: {e}"


def append_agent_log(state: Dict[str, Any], *, agent: str, event: str, payload: Any) -> Dict[str, Any]:
    """Return a partial state update with one new entry appended to agent_logs."""
    logs = list(state.get("agent_logs") or [])
    logs.append(
        {
            "ts": utc_now_iso(),
            "agent": agent,
            "event": event,
            "payload": payload,
        }
    )
    return {"agent_logs": logs}