lwant committed on
Commit cb13db3 · 1 Parent(s): 2d3a434

Add Phoenix and Tavily integrations, update dependencies, and enhance documentation

Introduced `PHOENIX_API_KEY` and `TAVILY_API_KEY` environment variables for new Phoenix and Tavily API integrations. Updated `uv.lock` to include dependencies like `tavily-python`, `arize-phoenix`, and related tools. Expanded README with installation instructions, observability setup, and troubleshooting steps. Enhanced project capabilities with compatibility improvements.

Files changed (5)
  1. .example.env +4 -0
  2. README.md +28 -1
  3. pyproject.toml +5 -0
  4. src/gaia_solving_agent/__init__.py +3 -1
  5. uv.lock +0 -0
.example.env CHANGED
@@ -12,3 +12,7 @@
 
 HF_TOKEN = "hf_xxxxxx"
 NEBIUS_API_TOKEN = "xxxxxxxx"
+# See https://docs.llamaindex.ai/en/stable/module_guides/observability/#llamatrace-hosted-arize-phoenix
+PHOENIX_API_KEY = "xxxxxxxxxxxxxxxxxxx:xxxxxxx"
+
+TAVILY_API_KEY = "tvly-dev-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
README.md CHANGED
@@ -13,4 +13,31 @@ hf_oauth: true
 hf_oauth_expiration_minutes: 480
 ---
 
-Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
+Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
+
+# Agent course final assignment
+
+## Install
+
+## LlamaIndex observability
+
+```python
+from gaia_solving_agent.telemetry import set_telemetry
+
+
+set_telemetry()
+
+# Run your agent
+...
+```
+
+## Troubleshooting
+
+### Debug log level for OpenAI-like clients
+```bash
+export OPENAI_LOG=debug
+```
+```python
+import os
+os.environ["OPENAI_LOG"] = "debug"
+```
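The README's troubleshooting step can also be done programmatically with the standard `logging` module. A minimal sketch, assuming the client logs under the conventional `openai` logger name (nothing below is defined by this repository):

```python
import logging
import os

# Equivalent of `export OPENAI_LOG=debug`; must be set before the
# OpenAI-compatible client is constructed for it to take effect.
os.environ["OPENAI_LOG"] = "debug"

# Programmatic fallback: configure a root handler, then raise the
# client's logger to DEBUG so request/response details become visible.
logging.basicConfig(level=logging.INFO)
logging.getLogger("openai").setLevel(logging.DEBUG)
```

Setting the environment variable covers subprocesses as well, while the logger-level approach scopes the verbosity to the current process only.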
pyproject.toml CHANGED
@@ -9,7 +9,10 @@ dependencies = [
     "llama-index>=0.12.43",
     "llama-index-llms-huggingface-api>=0.5.0",
     "llama-index-llms-nebius>=0.1.2",
+    "llama-index-tools-duckduckgo>=0.3.0",
+    "llama-index-tools-requests>=0.4.0",
     "requests>=2.32.4",
+    "tavily-python>=0.7.8",
 ]
 
 [tool.pytest.ini_options]
@@ -17,5 +20,7 @@ pythonpath = ["config", "packages/src", "services"]
 
 [dependency-groups]
 dev = [
+    "arize-phoenix>=10.15.0",
+    "llama-index-callbacks-arize-phoenix>=0.5.1",
     "llama-index-utils-workflow>=0.3.4",
 ]
src/gaia_solving_agent/__init__.py CHANGED
@@ -4,4 +4,6 @@ from dotenv import load_dotenv
 # Load the .env file
 load_dotenv()
 HF_API_TOKEN = os.getenv("HF_TOKEN")
-NEBIUS_API_KEY = os.getenv("NEBIUS_API_TOKEN")
+NEBIUS_API_KEY = os.getenv("NEBIUS_API_TOKEN")
+PHOENIX_API_KEY = os.getenv("PHOENIX_API_KEY")
+TAVILY_API_KEY = os.getenv("TAVILY_API_KEY")
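A minimal sketch of how the keys added above flow into the package, using only the standard library (the repository itself loads `.env` via python-dotenv first; the `require_key` helper is hypothetical, not part of the codebase):

```python
import os

# Mirrors src/gaia_solving_agent/__init__.py: each key is read from the
# environment at import time. os.getenv returns None for an unset key,
# so a missing integration can be detected before any API call is made.
PHOENIX_API_KEY = os.getenv("PHOENIX_API_KEY")
TAVILY_API_KEY = os.getenv("TAVILY_API_KEY")


def require_key(name: str) -> str:
    """Hypothetical helper: return the named key or fail fast."""
    value = os.getenv(name)
    if value is None:
        raise RuntimeError(
            f"{name} is not set; copy .example.env to .env and fill it in"
        )
    return value
```

Failing fast at startup gives a clearer error than letting the Phoenix or Tavily client raise an authentication error mid-run.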
uv.lock CHANGED
The diff for this file is too large to render.