Claude committed
Commit 68d83ed · unverified · 1 parent: f9b2d64

Fix HF Spaces initialization and add comprehensive diagnostics

This commit addresses the root cause of the "Failed to initialize LLM client" error
on Hugging Face Spaces with multiple improvements:

1. **Updated requirements.txt for HF Spaces compatibility**
- Upgraded anthropic from 0.18.1 to >=0.34.0 (modern API support)
- Added explicit httpx>=0.24.0 dependency
- Fixes version mismatch causing initialization failures

2. **Made AnthropicClient backward compatible**
- Added version detection for http_client parameter
- Graceful fallback for older SDK versions
- Better error handling with informative messages

3. **Added API key validation**
- Validates API key format (must start with 'sk-ant-')
- Provides clear error messages for invalid keys
- Helps users quickly identify configuration issues

4. **Enhanced diagnostics in web UI**
- Added expandable System Diagnostics panel in sidebar
- Shows API key status, package versions, and system info
- Displays detailed error tracebacks for debugging
- Auto-selects Local Be.FM when no API key detected

5. **Created comprehensive HF Spaces setup guide**
- Step-by-step instructions for configuring secrets
- Troubleshooting section for common issues
- Alternative options (Local Be.FM model)

These changes ensure the app works correctly on HF Spaces with proper error
reporting and clear guidance for users to fix configuration issues.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <noreply@anthropic.com>
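
The format check described in item 3 can be exercised on its own. This is a minimal sketch mirroring the validation added to `AnthropicClient` in the diff below; `validate_anthropic_key` is a hypothetical standalone helper, not a name from the codebase:

```python
def validate_anthropic_key(key: str) -> None:
    """Raise ValueError if the key does not look like an Anthropic API key."""
    if not key.startswith("sk-ant-"):
        raise ValueError(
            f"Invalid Anthropic API key format. Key should start with "
            f"'sk-ant-' but got '{key[:10]}...'"
        )

validate_anthropic_key("sk-ant-api03-example")  # passes silently
```

A key pasted with quotes or a different prefix (e.g. an HF token) fails immediately with a message naming the first few characters, which is what makes the error actionable.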

HF_SPACES_SETUP.md ADDED
@@ -0,0 +1,91 @@
+ # Hugging Face Spaces Setup Guide
+
+ This guide explains how to properly configure your AI Personas app on Hugging Face Spaces.
+
+ ## Required Configuration
+
+ ### 1. Add Anthropic API Key to Secrets
+
+ Your Anthropic API key must be added to your Space's secrets:
+
+ 1. Go to your Space's page on Hugging Face
+ 2. Click on **Settings** (gear icon)
+ 3. Navigate to **Variables and secrets** section
+ 4. Click **Add a new secret**
+ 5. Enter:
+    - **Name:** `ANTHROPIC_API_KEY`
+    - **Value:** Your Anthropic API key (starts with `sk-ant-`)
+ 6. Click **Save**
+ 7. Restart your Space
+
+ ### 2. Verify API Key Format
+
+ Make sure your API key:
+ - Starts with `sk-ant-`
+ - Has no extra spaces or quotes
+ - Is a valid, non-expired key
+
+ You can get an API key from: https://console.anthropic.com/
+
+ ### 3. Check the Diagnostics
+
+ Once deployed, check the **🔍 System Diagnostics** section in the sidebar:
+ - Should show "✓ Found" for API Key Status
+ - Should show your API key preview (first 15 characters)
+ - Should show the Anthropic SDK version
+
+ ## Troubleshooting
+
+ ### Error: "Failed to initialize LLM client"
+
+ **Possible causes:**
+
+ 1. **Missing API Key**
+    - Solution: Add `ANTHROPIC_API_KEY` to Space secrets (see step 1 above)
+
+ 2. **Invalid API Key Format**
+    - Error will show: "Invalid Anthropic API key format"
+    - Solution: Verify your key starts with `sk-ant-`
+
+ 3. **Expired or Invalid API Key**
+    - Solution: Generate a new key from Anthropic Console
+
+ 4. **Wrong Secret Name**
+    - Solution: Make sure the secret is named exactly `ANTHROPIC_API_KEY` (case-sensitive)
+
+ ### Using Local Be.FM Model Instead
+
+ If you don't want to use the Anthropic API:
+ 1. Select **"Local Be.FM"** from the model dropdown
+ 2. Note: First run will download ~16GB model
+ 3. Requires GPU-enabled Space (not available on free tier)
+
+ ## Alternative: Use .env File (Local Development Only)
+
+ For local development, you can use a `.env` file:
+
+ ```bash
+ ANTHROPIC_API_KEY=sk-ant-...your-key-here...
+ LLM_MODEL=claude-3-haiku-20240307
+ LLM_MAX_TOKENS=2048
+ LLM_TEMPERATURE=0.7
+ ```
+
+ **Note:** The `.env` file is ignored by git and won't be deployed to HF Spaces.
+
+ ## Checking Deployment Status
+
+ After deploying or changing secrets:
+
+ 1. Wait for Space to rebuild (usually 1-2 minutes)
+ 2. Check the **Build logs** for any errors
+ 3. Open the app and check the sidebar for error messages
+ 4. Use the **🔍 System Diagnostics** expander to verify configuration
+
+ ## Need Help?
+
+ If you're still having issues:
+ 1. Check the detailed error in the expandable "🔍 Show detailed error" section
+ 2. Verify your API key is valid at https://console.anthropic.com/
+ 3. Make sure you've restarted the Space after adding secrets
+ 4. Check that the secret name is exactly `ANTHROPIC_API_KEY` (case-sensitive)
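
Since HF Spaces injects each secret into the app as an environment variable, the guide's manual checks can be scripted. A sketch, assuming the standard `ANTHROPIC_API_KEY` secret name from the guide above:

```python
import os

# HF Spaces exposes each Space secret to the app as an environment variable.
key = os.getenv("ANTHROPIC_API_KEY")
if key is None:
    print("Missing: add ANTHROPIC_API_KEY under Settings > Variables and secrets")
elif key != key.strip() or key.startswith(("'", '"')):
    print("Found, but it has extra spaces or quotes - re-enter the raw key")
elif not key.startswith("sk-ant-"):
    print(f"Found, but format looks wrong: '{key[:10]}...' (expected 'sk-ant-' prefix)")
else:
    print(f"OK: {key[:15]}...")
```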
requirements.txt CHANGED
@@ -3,8 +3,9 @@ python-dotenv==1.0.0
 pydantic==2.5.0
 pydantic-settings==2.1.0
 
-# LLM Integration
-anthropic==0.18.1
+# LLM Integration - Updated for HF Spaces compatibility
+anthropic>=0.34.0
+httpx>=0.24.0
 
 # API Framework
 fastapi==0.109.0
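
After the Space rebuilds, the new pins can be verified at runtime without importing the SDK itself. A sketch using only the standard library:

```python
from importlib.metadata import PackageNotFoundError, version

# Report what pip actually installed for the updated requirements.
for pkg in ("anthropic", "httpx"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```

If `anthropic` reports a version below 0.34.0 here, the Space is still running a stale build and needs a factory rebuild.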
src/llm/anthropic_client.py CHANGED
@@ -34,21 +34,41 @@ class AnthropicClient:
                 "Anthropic API key must be provided or set in ANTHROPIC_API_KEY env var"
             )
 
+        # Validate API key format (should start with sk-ant-)
+        if not self.api_key.startswith("sk-ant-"):
+            raise ValueError(
+                f"Invalid Anthropic API key format. Key should start with 'sk-ant-' but got '{self.api_key[:10]}...'"
+            )
+
         self.model = model or os.getenv("LLM_MODEL", "claude-3-5-sonnet-20241022")
         self.max_tokens = int(os.getenv("LLM_MAX_TOKENS", max_tokens))
         self.temperature = float(os.getenv("LLM_TEMPERATURE", temperature))
 
         # Initialize Anthropic client with custom httpx client to avoid proxy issues
         # This is needed for compatibility with some hosting environments (like HF Spaces)
-        import httpx
-
-        # Create httpx client without default proxy configuration
-        http_client = httpx.Client(
-            timeout=httpx.Timeout(60.0, connect=10.0),
-            limits=httpx.Limits(max_keepalive_connections=5, max_connections=10),
-        )
-
-        self.client = Anthropic(api_key=self.api_key, http_client=http_client)
+        # Note: http_client parameter is only supported in newer versions (>=0.34.0)
+        try:
+            import httpx
+            import inspect
+
+            # Check if Anthropic client supports http_client parameter
+            sig = inspect.signature(Anthropic.__init__)
+            supports_http_client = 'http_client' in sig.parameters
+
+            if supports_http_client:
+                # Create httpx client without default proxy configuration
+                http_client = httpx.Client(
+                    timeout=httpx.Timeout(60.0, connect=10.0),
+                    limits=httpx.Limits(max_keepalive_connections=5, max_connections=10),
+                )
+                self.client = Anthropic(api_key=self.api_key, http_client=http_client)
+            else:
+                # Fallback for older versions
+                self.client = Anthropic(api_key=self.api_key)
+        except Exception as e:
+            # If httpx setup fails, use basic client
+            print(f"Warning: Could not set up custom HTTP client: {e}")
+            self.client = Anthropic(api_key=self.api_key)
 
     def generate_response(
         self,
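
The capability probe in the diff above generalizes beyond the Anthropic SDK: `inspect.signature` can check any callable for a keyword before passing it. A sketch of the same pattern, demonstrated on a stdlib class so it runs without the SDK installed:

```python
import inspect
import textwrap

def accepts_kwarg(func, name: str) -> bool:
    """Return True if func's signature declares a parameter called `name`."""
    return name in inspect.signature(func).parameters

# Same probe the client applies to Anthropic.__init__ / 'http_client':
print(accepts_kwarg(textwrap.TextWrapper.__init__, "width"))        # True
print(accepts_kwarg(textwrap.TextWrapper.__init__, "http_client"))  # False
```

This is why the fallback works across SDK versions: the decision is made from the installed signature at runtime, not from a hardcoded version string.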
web_app.py CHANGED
@@ -113,6 +113,30 @@ llm_client = None  # Initialize to None to avoid NameError
 with st.sidebar:
     st.title("🤖 LLM Model")
 
+    # Diagnostics (can be expanded if needed)
+    with st.expander("🔍 System Diagnostics"):
+        api_key = os.getenv("ANTHROPIC_API_KEY")
+        st.write(f"**API Key Status:** {'✓ Found' if api_key else '✗ Missing'}")
+        if api_key:
+            st.write(f"**API Key Preview:** `{api_key[:15]}...`")
+
+        # Show package versions
+        try:
+            import anthropic
+            st.write(f"**Anthropic SDK:** v{anthropic.__version__}")
+        except:
+            st.write("**Anthropic SDK:** Not installed")
+
+        try:
+            import torch
+            st.write(f"**PyTorch:** v{torch.__version__}")
+            st.write(f"**Device:** {torch.device('mps' if torch.backends.mps.is_available() else 'cpu')}")
+        except:
+            st.write("**PyTorch:** Not installed")
+
+        st.write(f"**Python:** {sys.version.split()[0]}")
+        st.write(f"**Streamlit:** v{st.__version__}")
+
     # Check API key availability
     api_key_available = bool(os.getenv("ANTHROPIC_API_KEY"))
     if not api_key_available: