Commit a477044 · Parent: d25df55

feat: Enhance admin rules with file upload, drag-and-drop, chunk processing, and improved UI

Files changed:

- RULES_EXAMPLES.md +266 -0
- SUPABASE_SETUP.md +122 -0
- app.py +274 -51
- backend/api/routes/admin.py +314 -22
- backend/api/routes/agent.py +157 -0
- backend/api/services/agent_orchestrator.py +99 -39
- backend/api/services/llm_client.py +44 -0
- backend/api/services/rule_enhancer.py +184 -0
- backend/api/storage/create_supabase_table.py +212 -0
- backend/api/storage/rules_store.py +344 -57
- check_rules_db.py +43 -0
- create_supabase_table.py +185 -0
- create_supabase_table_simple.py +70 -0
- data/admin_rules.db +0 -0
- data/analytics.db +0 -0
- example_rules.txt +133 -0
- example_rules_detailed.json +131 -0
- frontend/app/admin-rules/page.tsx +350 -69
- frontend/components/admin-rules-panel.tsx +24 -24
- setup_supabase_table.py +121 -0
- supabase_admin_rules_table.sql +59 -0
RULES_EXAMPLES.md (ADDED)
@@ -0,0 +1,266 @@
# Admin Rules Examples for IntegraChat

This document provides examples of rules you can use with the IntegraChat admin rules system.

## Quick Start

1. **Simple Rules** - Copy from `example_rules.txt` and paste into the Gradio UI
2. **Detailed Rules** - Use `example_rules_detailed.json` for rules with patterns and severity
3. **API** - Use the `/admin/rules` or `/admin/rules/bulk` endpoints

## Rule Categories

### 🔴 Critical Severity Rules

These rules block the most sensitive information:

```
Block password disclosure requests
Prevent sharing of API keys or tokens
No sharing of credit card information
Block requests for bank account details
Prevent sharing of health information
No disclosure of children's personal information
```

### 🟠 High Severity Rules

Important security and compliance rules:

```
Block social security number requests
Prevent disclosure of proprietary information
No unauthorized access to financial records
Block requests to delete system logs
Prevent unauthorized system configuration changes
No sharing of infrastructure credentials
```

### 🟡 Medium Severity Rules

Operational and compliance rules:

```
Block requests for employee personal information
Prevent sharing of customer data without authorization
Block requests for confidential business strategies
Prevent disclosure of personal data of EU citizens
Block requests for generating harmful content
Prevent creation of misleading information
```

### 🟢 Low Severity Rules

General business rules:

```
Block requests for competitor pricing information
Prevent sharing of upcoming product launch details
No disclosure of vendor contract terms
Block requests for customer churn analysis data
```
## Using Rules with Patterns

For more precise matching, you can specify regex patterns:

### Example 1: Password Detection
```json
{
  "rule": "Block password disclosure requests",
  "pattern": ".*(password|pwd|passcode|credential|login).*",
  "severity": "high",
  "description": "Prevents users from requesting or sharing passwords"
}
```

### Example 2: API Key Detection
```json
{
  "rule": "Prevent sharing of API keys or tokens",
  "pattern": ".*(api.?key|token|secret|access.?key|auth.?token).*",
  "severity": "critical",
  "description": "Blocks requests to share API keys or tokens"
}
```

### Example 3: Credit Card Detection
```json
{
  "rule": "No sharing of credit card information",
  "pattern": ".*(credit.?card|card.?number|cvv|cvc|expiration).*",
  "severity": "critical",
  "description": "Blocks credit card information sharing"
}
```
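The three patterns above can be checked against sample messages with Python's `re` module. This is a minimal sketch only; the backend's actual matcher may behave differently, and case-insensitive matching is an assumption here:

```python
import re

# The three example patterns above, keyed by a short rule name.
patterns = {
    "password": r".*(password|pwd|passcode|credential|login).*",
    "api_key": r".*(api.?key|token|secret|access.?key|auth.?token).*",
    "credit_card": r".*(credit.?card|card.?number|cvv|cvc|expiration).*",
}

def matching_rules(message: str) -> list[str]:
    """Return the names of all rules whose pattern matches the message."""
    # IGNORECASE is assumed; the real matcher may be case-sensitive.
    return [
        name for name, pattern in patterns.items()
        if re.fullmatch(pattern, message, re.IGNORECASE)
    ]

print(matching_rules("What is the admin password?"))  # ['password']
print(matching_rules("Share the API key please"))     # ['api_key']
print(matching_rules("What's the weather today?"))    # []
```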
## Adding Rules

### Method 1: Via Gradio UI (Easiest)

1. Open the IntegraChat Gradio interface
2. Go to the "Admin Rules & Compliance" tab
3. Enter your tenant ID
4. Paste rules from `example_rules.txt` (one per line)
5. Click "Upload / Append Rules"

### Method 2: Via API (Programmatic)

**Single Rule:**
```bash
curl -X POST http://localhost:8000/admin/rules \
  -H "Content-Type: application/json" \
  -H "x-tenant-id: your_tenant_id" \
  -d '{
    "rule": "Block password disclosure requests",
    "pattern": ".*(password|pwd|passcode).*",
    "severity": "high",
    "description": "Prevents password sharing"
  }'
```

**Bulk Rules:**
```bash
curl -X POST http://localhost:8000/admin/rules/bulk \
  -H "Content-Type: application/json" \
  -H "x-tenant-id: your_tenant_id" \
  -d '{
    "rules": [
      "Block password disclosure requests",
      "Prevent sharing of API keys",
      "No sharing of credit card information"
    ]
  }'
```

### Method 3: Using Python

```python
import requests

BASE_URL = "http://localhost:8000"
TENANT_ID = "your_tenant_id"

# Add a single rule
response = requests.post(
    f"{BASE_URL}/admin/rules",
    json={
        "rule": "Block password disclosure requests",
        "pattern": ".*(password|pwd).*",
        "severity": "high"
    },
    headers={"x-tenant-id": TENANT_ID}
)

# Add bulk rules
response = requests.post(
    f"{BASE_URL}/admin/rules/bulk",
    json={
        "rules": [
            "Block password disclosure requests",
            "Prevent sharing of API keys"
        ]
    },
    headers={"x-tenant-id": TENANT_ID}
)
```
## Rule Enhancement

When you add rules, the LLM will automatically:

- ✅ Identify edge cases (e.g., "password" → also catches "pwd", "passcode")
- ✅ Improve regex patterns for better matching
- ✅ Suggest appropriate severity levels
- ✅ Write clear descriptions

**Example:**
- **Input:** `Block password queries`
- **Enhanced:**
  - Pattern: `.*password.*|.*pwd.*|.*passcode.*`
  - Severity: `high`
  - Edge cases: ["pwd", "passcode", "login credentials"]
## Testing Rules

After adding rules, test them by asking questions that should be blocked:

```
❌ "What is the admin password?"
❌ "Can you share the API key?"
❌ "Show me credit card numbers"
❌ "What's the SSN for user 123?"

✅ "How do I reset my password?" (if rule allows)
✅ "What is password hashing?" (educational, not disclosure)
```
## Best Practices

1. **Start Simple** - Begin with basic rules, then add patterns
2. **Test Thoroughly** - Test rules with various phrasings
3. **Review Edge Cases** - Check if rules block legitimate queries
4. **Use Appropriate Severity** - Match severity to risk level
5. **Regular Updates** - Review and update rules periodically
6. **Document Patterns** - Add descriptions explaining what each rule blocks

## Common Patterns

### Password Detection
```
.*(password|pwd|passcode|credential|login|auth).*
```

### Financial Information
```
.*(credit.?card|card.?number|cvv|bank.?account|routing).*
```

### Personal Information
```
.*(ssn|social.?security|tax.?id|personal.?data|pii).*
```

### API/Security
```
.*(api.?key|token|secret|access.?key|auth.?token).*
```

### Health Information
```
.*(health|medical|patient|hipaa|diagnosis).*
```
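Worth noting: the `.?` in these patterns matches zero or one arbitrary character, which is why a single pattern covers several spellings of the same term. A quick sketch to see this in action:

```python
import re

# ".?" allows a space, dash, underscore, nothing, or any single character
# between the two words, so one pattern covers many spellings.
api_key_pattern = re.compile(
    r".*(api.?key|token|secret|access.?key|auth.?token).*", re.IGNORECASE
)

for text in ["my api key", "my api-key", "my api_key", "my apikey"]:
    print(text, "->", bool(api_key_pattern.fullmatch(text)))
# All four match.
```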
## Viewing Rules

```bash
# Get all rules
curl http://localhost:8000/admin/rules \
  -H "x-tenant-id: your_tenant_id"

# Get detailed rules with patterns
curl "http://localhost:8000/admin/rules?detailed=true" \
  -H "x-tenant-id: your_tenant_id"
```

## Deleting Rules

```bash
curl -X DELETE http://localhost:8000/admin/rules/Block%20password%20disclosure%20requests \
  -H "x-tenant-id: your_tenant_id"
```
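Because the rule text is part of the URL path, it must be percent-encoded as in the curl example above. In Python this can be done with `urllib.parse.quote`:

```python
from urllib.parse import quote

rule = "Block password disclosure requests"
# The rule text goes in the URL path, so spaces and other special
# characters must be percent-encoded.
path = f"/admin/rules/{quote(rule)}"
print(path)
# /admin/rules/Block%20password%20disclosure%20requests
```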
## Monitoring Violations

```bash
# Get recent violations
curl http://localhost:8000/admin/violations \
  -H "x-tenant-id: your_tenant_id"
```

## Need Help?

- Check `example_rules.txt` for simple rule examples
- See `example_rules_detailed.json` for advanced patterns
- Review the API documentation in `README.md`
- Test rules in the Gradio UI before deploying
SUPABASE_SETUP.md (ADDED)
@@ -0,0 +1,122 @@
# Supabase Setup for Admin Rules

This guide will help you set up Supabase to store admin rules instead of SQLite.

## Step 1: Create the Table in Supabase

1. **Go to your Supabase Dashboard**
   - Navigate to: https://app.supabase.com
   - Select your project

2. **Open SQL Editor**
   - Click on "SQL Editor" in the left sidebar
   - Click "New query"

3. **Run the SQL Script**
   - Copy the contents of `supabase_admin_rules_table.sql`
   - Paste it into the SQL Editor
   - Click "Run" to execute

This will create:
- the `admin_rules` table with all necessary columns
- indexes for performance
- Row Level Security (RLS) policies
- automatic timestamp updates

## Step 2: Configure Environment Variables

Make sure your `.env` file has Supabase credentials:

```env
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_KEY=your_service_role_key_here
```

**Important:** Use the **Service Role Key** (not the anon key) for full access.

To find your keys:
1. Go to Supabase Dashboard → Settings → API
2. Copy the "Project URL" → `SUPABASE_URL`
3. Copy the "service_role" key → `SUPABASE_SERVICE_KEY`

## Step 3: Verify Setup

The `RulesStore` will automatically use Supabase if:
- `SUPABASE_URL` is set
- `SUPABASE_SERVICE_KEY` is set
- the Supabase Python client is installed (`pip install supabase`)

If Supabase is not configured, it falls back to SQLite automatically.

## Step 4: Test the Integration

You can test whether rules are being saved to Supabase:

```python
from backend.api.storage.rules_store import RulesStore

store = RulesStore()
print(f"Using Supabase: {store.use_supabase}")

# Add a test rule
store.add_rule("test_tenant", "Test rule", severity="high")
print("Rule added!")

# Get rules
rules = store.get_rules("test_tenant")
print(f"Rules: {rules}")
```
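The fallback behavior described in Step 3 can be sketched roughly as follows. This is hypothetical selection logic mirroring the documented behavior, not the real `RulesStore` internals (which would also check that the `supabase` client imports successfully):

```python
import os

def should_use_supabase() -> bool:
    """Use Supabase only when both credentials are present;
    otherwise fall back to SQLite (client import check omitted)."""
    return bool(os.getenv("SUPABASE_URL")) and bool(os.getenv("SUPABASE_SERVICE_KEY"))

# With no credentials set, the store would fall back to SQLite.
os.environ.pop("SUPABASE_URL", None)
os.environ.pop("SUPABASE_SERVICE_KEY", None)
print(should_use_supabase())  # False

# With both set, Supabase is selected.
os.environ["SUPABASE_URL"] = "https://example.supabase.co"
os.environ["SUPABASE_SERVICE_KEY"] = "service-role-key"
print(should_use_supabase())  # True
```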
## Step 5: View Rules in Supabase

1. Go to Supabase Dashboard → Table Editor
2. Select the `admin_rules` table
3. You should see all your rules with tenant isolation

## Migration from SQLite

If you have existing rules in SQLite and want to migrate:

1. Export from SQLite:
```python
import sqlite3

conn = sqlite3.connect('data/admin_rules.db')
conn.row_factory = sqlite3.Row  # so rows can be accessed by column name below
cursor = conn.execute("SELECT * FROM admin_rules")
rules = cursor.fetchall()
```

2. Import to Supabase:
```python
from backend.api.storage.rules_store import RulesStore

store = RulesStore(use_supabase=True)
for rule in rules:
    store.add_rule(rule['tenant_id'], rule['rule'],
                   pattern=rule['pattern'],
                   severity=rule['severity'] or 'medium')
```
## Troubleshooting

### Rules not appearing in Supabase
- Check that RLS policies allow your service role to read/write
- Verify environment variables are set correctly
- Check Supabase logs for errors

### Fallback to SQLite
- If Supabase credentials are missing, it automatically uses SQLite
- Check that your `.env` file has correct values
- Restart your FastAPI server after changing `.env`

### Permission Errors
- Make sure you're using the **service_role** key (not the anon key)
- Check that RLS policies in Supabase allow service role access

## Benefits of Using Supabase

- ✅ **Scalability** - Handle millions of rules
- ✅ **Multi-region** - Global availability
- ✅ **Backups** - Automatic backups
- ✅ **Real-time** - Can subscribe to changes
- ✅ **Security** - Row Level Security built-in
- ✅ **Analytics** - Built-in query performance monitoring
app.py (CHANGED)
Removed lines (old version; lines whose content was not recoverable from this diff view are shown as `…`):

```diff
@@ -19,28 +19,34 @@ BACKEND_BASE_URL = os.getenv("BACKEND_BASE_URL", "http://localhost:8000")
 def chat_with_agent(message, tenant_id, history):
     """
     Send a message to the backend MCP agent and return the response.

     Args:
         message: User's message text
         tenant_id: Tenant ID for multi-tenant isolation
         history: Chat history (Gradio messages format)

-    Returns:
-        Updated chat history with agent response
     """
     if not message or not message.strip():
-        return history

     if not tenant_id or not tenant_id.strip():
         error_msg = "Please enter a Tenant ID before sending a message."
         history.append({"role": "user", "content": message})
         history.append({"role": "assistant", "content": error_msg})
-        return history

-    # Backend endpoint
-    …

-    # Prepare request payload
     payload = {
         "tenant_id": tenant_id.strip(),
         "message": message,
@@ -49,55 +55,91 @@ def chat_with_agent(message, tenant_id, history):
         "temperature": 0.0
     }

-    # Prepare headers
-    headers = {
-        "Content-Type": "application/json"
-    }
-
     try:
-        # …
-        # Increased timeout to 120 seconds for complex agent operations
-        # (RAG search, web search, LLM calls can take time)
         response = requests.post(
             backend_url,
             json=payload,
-            headers=headers,
             timeout=120
         )

-        # Check if request was successful
         if response.status_code == 200:
-            …
         else:
             error_msg = f"Error {response.status_code}: {response.text}"
-            history.append({"role": "user", "content": message})
             history.append({"role": "assistant", "content": error_msg})

     except requests.exceptions.ConnectionError:
         error_msg = "❌ Connection Error: Could not connect to backend. Please ensure the FastAPI server is running at http://localhost:8000"
-        history.append({"role": "user", "content": message})
         history.append({"role": "assistant", "content": error_msg})

     except requests.exceptions.Timeout:
         error_msg = "⏱️ Request Timeout: The backend took longer than 2 minutes to respond. This may happen if:\n- The LLM is processing a complex query\n- Multiple tools (RAG, Web Search) are being used\n- The backend is under heavy load\n\nPlease try again with a simpler query, or check if the backend services (Ollama, MCP servers) are running properly."
-        history.append({"role": "user", "content": message})
         history.append({"role": "assistant", "content": error_msg})

     except requests.exceptions.RequestException as e:
         error_msg = f"❌ Request Error: {str(e)}"
-        history.append({"role": "user", "content": message})
         history.append({"role": "assistant", "content": error_msg})

     except Exception as e:
         error_msg = f"❌ Unexpected Error: {str(e)}"
-        history.append({"role": "user", "content": message})
         history.append({"role": "assistant", "content": error_msg})
-
-    return history


 def ingest_document(
@@ -229,6 +271,70 @@ def fetch_admin_rules(tenant_id: str) -> tuple[str, list[list]]:
         return f"❌ Unexpected error: {exc}", []


 def add_admin_rules(tenant_id: str, rules_text: str) -> str:
     if not tenant_id or not tenant_id.strip():
         return "❌ Tenant ID is required."
@@ -236,32 +342,88 @@
         return "❌ Provide at least one rule to upload."

     tenant_id = tenant_id.strip()
-    …
     if not rules:
-        return "❌ No valid rules detected."

     added = []
     errors = []
-    …
     try:
         resp = requests.post(
             f"{BACKEND_BASE_URL}/admin/rules",
-            params={"rule": …},
             headers={"x-tenant-id": tenant_id},
-            timeout=…,
         )
         if resp.status_code == 200:
-            …
         else:
-            errors.append(f"{…
     except Exception as exc:
-        errors.append(f"{…

     summary = []
     if added:
-        summary.append(f"✅ Added {len(added)} rule(s):\n" + "\n".join([f"- {r}" for r in added]))
     if errors:
-        summary.append("⚠️ Errors:\n" + "\n".join(errors))

     return "\n\n".join(summary) if summary else "No rules were added."
@@ -292,6 +454,34 @@ def delete_admin_rule(tenant_id: str, rule: str) -> str:
         return f"❌ Unexpected error: {exc}"


 def add_rules_and_refresh(tenant_id: str, rules_text: str):
     status = add_admin_rules(tenant_id, rules_text)
     summary, rows = fetch_admin_rules(tenant_id)
@@ -951,10 +1141,20 @@ with gr.Blocks(
     """
 )

-    # Event handlers for chat tab
     def send_message(message, tenant_id, history):
-        …

     send_button.click(
         fn=send_message,
@@ -1414,9 +1614,15 @@ with gr.Blocks(
     ### 🛡️ Admin Rules & Regulations
     Upload or manage tenant-specific governance rules (red-flag patterns, compliance policies, etc.).

-    …
-    - …
-    - …
     """
 )
@@ -1433,12 +1639,23 @@ with gr.Blocks(
     refresh_rules_button = gr.Button("Refresh Rules", variant="secondary")
     gr.Markdown("")

-    …

     delete_rule_input = gr.Textbox(
         label="Delete Rule",
@@ -1458,6 +1675,12 @@ with gr.Blocks(
     outputs=[rules_status, rules_summary, rules_table]
 )

 delete_rule_button.click(
     fn=delete_rule_and_refresh,
     inputs=[tenant_id_input, delete_rule_input],
```
Added lines (new version):

```diff
 def chat_with_agent(message, tenant_id, history):
     """
     Send a message to the backend MCP agent and return the response.
+    Uses streaming for real-time word-by-word updates.

     Args:
         message: User's message text
         tenant_id: Tenant ID for multi-tenant isolation
         history: Chat history (Gradio messages format)

+    Yields:
+        Updated chat history with agent response (streaming)
     """
     if not message or not message.strip():
+        yield history
+        return

     if not tenant_id or not tenant_id.strip():
         error_msg = "Please enter a Tenant ID before sending a message."
         history.append({"role": "user", "content": message})
         history.append({"role": "assistant", "content": error_msg})
+        yield history
+        return

+    # Add user message to history
+    history.append({"role": "user", "content": message})

+    # Backend streaming endpoint
+    backend_url = f"{BACKEND_BASE_URL}/agent/message/stream"
+
+    # Prepare request payload
     payload = {
         "tenant_id": tenant_id.strip(),
         "message": message,
         …
         "temperature": 0.0
     }

     try:
+        # Make streaming request
         response = requests.post(
             backend_url,
             json=payload,
+            headers={"Content-Type": "application/json"},
+            stream=True,
             timeout=120
         )

         if response.status_code == 200:
+            # Initialize assistant message
+            assistant_message = ""
+            history.append({"role": "assistant", "content": assistant_message})
+            yield history  # Yield initial empty message
+
+            # Stream tokens - use iter_lines for SSE format
+            for line_bytes in response.iter_lines():
+                if line_bytes:
+                    try:
+                        line = line_bytes.decode('utf-8').strip()
+                        if not line:
+                            continue
+
+                        if line.startswith('data: '):
+                            data_str = line[6:]  # Remove 'data: ' prefix
+                            try:
+                                data = json.loads(data_str)
+
+                                # Handle status messages
+                                if 'status' in data:
+                                    status_msg = data.get('message', '')
+                                    if status_msg:
+                                        # Show status in the message temporarily
+                                        history[-1] = {"role": "assistant", "content": f"⏳ {status_msg}"}
+                                        yield history
+                                    continue
+
+                                # Handle tokens
+                                token = data.get('token', '')
+                                if token:
+                                    assistant_message += token
+                                    # Update the last message in history
+                                    history[-1] = {"role": "assistant", "content": assistant_message}
+                                    yield history  # Yield updated history immediately
+
+                                if data.get('done', False):
+                                    break
+                            except json.JSONDecodeError:
+                                continue
+                        elif line.startswith('error:'):
+                            try:
+                                error_data = json.loads(line[6:])
+                                error_msg = error_data.get('error', 'Unknown error')
+                                history[-1] = {"role": "assistant", "content": f"❌ Error: {error_msg}"}
+                                yield history
+                                break
+                            except Exception:
+                                pass
+                    except UnicodeDecodeError:
+                        continue
         else:
             error_msg = f"Error {response.status_code}: {response.text}"
             history.append({"role": "assistant", "content": error_msg})
+            yield history

     except requests.exceptions.ConnectionError:
         error_msg = "❌ Connection Error: Could not connect to backend. Please ensure the FastAPI server is running at http://localhost:8000"
         history.append({"role": "assistant", "content": error_msg})
+        yield history

     except requests.exceptions.Timeout:
         error_msg = "⏱️ Request Timeout: The backend took longer than 2 minutes to respond. This may happen if:\n- The LLM is processing a complex query\n- Multiple tools (RAG, Web Search) are being used\n- The backend is under heavy load\n\nPlease try again with a simpler query, or check if the backend services (Ollama, MCP servers) are running properly."
         history.append({"role": "assistant", "content": error_msg})
+        yield history

     except requests.exceptions.RequestException as e:
         error_msg = f"❌ Request Error: {str(e)}"
         history.append({"role": "assistant", "content": error_msg})
+        yield history

     except Exception as e:
         error_msg = f"❌ Unexpected Error: {str(e)}"
         history.append({"role": "assistant", "content": error_msg})
+        yield history


 def ingest_document(
```
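The streaming handler above consumes server-sent events. Its core `data:` handling can be exercised in isolation; this sketch mirrors (and simplifies) the loop's token and status handling, with made-up sample events:

```python
import json

def apply_sse_line(line: str, assistant_message: str) -> tuple[str, bool]:
    """Apply one SSE line; return (updated message, done flag)."""
    if not line.startswith("data: "):
        return assistant_message, False
    try:
        data = json.loads(line[6:])  # strip the 'data: ' prefix
    except json.JSONDecodeError:
        return assistant_message, False
    if "status" in data:  # status updates don't change the accumulated text
        return assistant_message, False
    return assistant_message + data.get("token", ""), data.get("done", False)

# Hypothetical event stream, matching the shapes handled above.
events = [
    'data: {"status": "thinking", "message": "Searching..."}',
    'data: {"token": "Hello"}',
    'data: {"token": " world"}',
    'data: {"done": true}',
]
msg, done = "", False
for line in events:
    msg, done = apply_sse_line(line, msg)
print(msg, done)  # Hello world True
```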
|
|
|
| 271 |
return f"β Unexpected error: {exc}", []
|
| 272 |
|
| 273 |
|
| 274 |
+
def extract_rules_from_file(file_path) -> str:
|
| 275 |
+
"""
|
| 276 |
+
Extract rules from uploaded file (TXT, PDF, DOC, DOCX).
|
| 277 |
+
Returns the extracted text content.
|
| 278 |
+
"""
|
| 279 |
+
if file_path is None:
|
| 280 |
+
return ""
|
| 281 |
+
|
| 282 |
+
try:
|
| 283 |
+
# Gradio File component returns file path as string
|
| 284 |
+
if isinstance(file_path, str):
|
| 285 |
+
file_path = Path(file_path)
|
| 286 |
+
else:
|
| 287 |
+
# Sometimes it's a file object with .name attribute
|
| 288 |
+
file_path = Path(file_path.name if hasattr(file_path, 'name') else file_path)
|
| 289 |
+
|
| 290 |
+
if not file_path.exists():
|
| 291 |
+
return f"β File not found: {file_path}"
|
| 292 |
+
|
| 293 |
+
file_ext = file_path.suffix.lower()
|
| 294 |
+
|
| 295 |
+
# Read file based on type
|
| 296 |
+
if file_ext == '.txt' or file_ext == '.md':
|
| 297 |
+
# Plain text files
|
| 298 |
+
with open(file_path, 'r', encoding='utf-8', errors='ignore') as f:
|
| 299 |
+
content = f.read()
|
| 300 |
+
return content
|
| 301 |
+
|
| 302 |
+
elif file_ext == '.pdf':
|
| 303 |
+
# PDF files - use PyPDF2
|
| 304 |
+
try:
|
| 305 |
+
import PyPDF2
|
| 306 |
+
with open(file_path, 'rb') as f:
|
| 307 |
+
pdf_reader = PyPDF2.PdfReader(f)
|
| 308 |
+
content = []
|
| 309 |
+
for page in pdf_reader.pages:
|
| 310 |
+
content.append(page.extract_text())
|
| 311 |
+
return '\n'.join(content)
|
| 312 |
+
except ImportError:
|
| 313 |
+
return "β PDF extraction requires PyPDF2. Install with: pip install PyPDF2"
|
| 314 |
+
except Exception as e:
|
| 315 |
+
return f"β Failed to extract text from PDF: {str(e)}"
|
| 316 |
+
|
| 317 |
+
elif file_ext in ['.doc', '.docx']:
|
| 318 |
+
# DOC/DOCX files - use python-docx
|
| 319 |
+
try:
|
| 320 |
+
from docx import Document
|
| 321 |
+
doc = Document(file_path)
|
| 322 |
+
content = []
|
| 323 |
+
for paragraph in doc.paragraphs:
|
| 324 |
+
content.append(paragraph.text)
|
| 325 |
+
return '\n'.join(content)
|
| 326 |
+
except ImportError:
|
| 327 |
+
return "β DOCX extraction requires python-docx. Install with: pip install python-docx"
|
| 328 |
+
except Exception as e:
|
| 329 |
+
return f"β Failed to extract text from DOCX: {str(e)}"
|
| 330 |
+
|
| 331 |
+
else:
|
| 332 |
+
return f"β Unsupported file type: {file_ext}. Supported: .txt, .pdf, .doc, .docx"
|
| 333 |
+
|
| 334 |
+
except Exception as e:
|
| 335 |
+
return f"β Error reading file: {str(e)}"
|
| 336 |
+
|
| 337 |
+
|
| 338 |
def add_admin_rules(tenant_id: str, rules_text: str) -> str:
    if not tenant_id or not tenant_id.strip():
        return "❌ Tenant ID is required."

    if not rules_text or not rules_text.strip():
        return "❌ Provide at least one rule to upload."

    tenant_id = tenant_id.strip()
    # Filter out comment lines (starting with #) and empty lines
    rules = [
        rule.strip()
        for rule in rules_text.splitlines()
        if rule.strip() and not rule.strip().startswith("#")
    ]
    if not rules:
        return "❌ No valid rules detected. (Comment lines starting with # are ignored)"

    added = []
    enhanced = []
    errors = []

    # Process rules in chunks to avoid timeouts
    CHUNK_SIZE = 5  # Process 5 rules at a time
    total_rules = len(rules)

    if total_rules == 1:
        # Single rule - use the regular endpoint
        try:
            resp = requests.post(
                f"{BACKEND_BASE_URL}/admin/rules",
                params={"rule": rules[0], "enhance": "true"},
                headers={"x-tenant-id": tenant_id},
                timeout=30
            )
            if resp.status_code == 200:
                data = resp.json()
                added.append(data.get("added_rule", rules[0]))
                if data.get("enhanced"):
                    edge_cases = data.get("edge_cases", [])
                    improvements = data.get("improvements", [])
                    if edge_cases or improvements:
                        enhanced.append(f"**{data.get('added_rule', rules[0])}**:")
                        if improvements:
                            enhanced.append(f"  • Improvements: {', '.join(improvements[:3])}")
                        if edge_cases:
                            enhanced.append(f"  • Edge cases identified: {len(edge_cases)}")
            else:
                errors.append(f"{rules[0]} -> {resp.status_code}: {resp.text}")
        except Exception as exc:
            errors.append(f"{rules[0]} -> {exc}")
    else:
        # Multiple rules - process in chunks
        for i in range(0, total_rules, CHUNK_SIZE):
            chunk = rules[i:i + CHUNK_SIZE]
            chunk_num = (i // CHUNK_SIZE) + 1
            total_chunks = (total_rules + CHUNK_SIZE - 1) // CHUNK_SIZE

            try:
                resp = requests.post(
                    f"{BACKEND_BASE_URL}/admin/rules/bulk",
                    json={"rules": chunk},
                    headers={"x-tenant-id": tenant_id},
                    params={"enhance": "true"},
                    timeout=45  # Timeout per chunk (5 rules)
                )
                if resp.status_code == 200:
                    data = resp.json()
                    chunk_added = data.get("added_rules", [])
                    added.extend(chunk_added)
                    if data.get("enhanced"):
                        chunk_enhanced = data.get("enhancement_summary", [])
                        enhanced.extend([f"[Chunk {chunk_num}/{total_chunks}] {e}" for e in chunk_enhanced])
                else:
                    errors.append(f"Chunk {chunk_num}/{total_chunks} failed: {resp.status_code}: {resp.text}")
            except requests.exceptions.Timeout:
                errors.append(f"Chunk {chunk_num}/{total_chunks} timed out after 45s")
            except Exception as exc:
                errors.append(f"Chunk {chunk_num}/{total_chunks} error: {exc}")

    summary = []
    if added:
        summary.append(f"✅ Added {len(added)}/{total_rules} rule(s):\n" + "\n".join([f"- {r}" for r in added[:10]]))
        if len(added) > 10:
            summary.append(f"... and {len(added) - 10} more")
    if enhanced:
        summary.append("\n🤖 LLM Enhancement Applied:\n" + "\n".join(enhanced[:5]))
        if len(enhanced) > 5:
            summary.append(f"... and {len(enhanced) - 5} more enhancements")
    if errors:
        summary.append("\n⚠️ Errors:\n" + "\n".join(errors))

    return "\n\n".join(summary) if summary else "No rules were added."

        ...
        return f"❌ Unexpected error: {exc}"


def add_rules_from_file(tenant_id: str, file_path):
    """
    Extract rules from an uploaded file and add them.
    """
    if not tenant_id or not tenant_id.strip():
        return "❌ Tenant ID is required.", "📋 Click **Refresh Rules** to see existing entries.", []

    if file_path is None:
        return "❌ Please select a file to upload.", "📋 Click **Refresh Rules** to see existing entries.", []

    # Extract text from the file
    extracted_text = extract_rules_from_file(file_path)

    if extracted_text.startswith("❌"):
        # An error occurred during extraction
        summary, rows = fetch_admin_rules(tenant_id)
        return extracted_text, summary, rows

    if not extracted_text or not extracted_text.strip():
        summary, rows = fetch_admin_rules(tenant_id)
        return "❌ No text could be extracted from the file.", summary, rows

    # Add rules from the extracted text
    status = add_admin_rules(tenant_id, extracted_text)
    summary, rows = fetch_admin_rules(tenant_id)
    return status, summary, rows


def add_rules_and_refresh(tenant_id: str, rules_text: str):
    status = add_admin_rules(tenant_id, rules_text)
    summary, rows = fetch_admin_rules(tenant_id)
        ...
            """
        )

        # Event handlers for the chat tab, with streaming
        def send_message(message, tenant_id, history):
            # Clear the message input immediately
            message_input_value = ""
            # Use the streaming function, which yields updates;
            # Gradio handles the generator and updates the UI in real time
            try:
                for updated_history in chat_with_agent(message, tenant_id, history):
                    yield updated_history, message_input_value
            except Exception as e:
                # Fallback if streaming fails
                error_msg = f"Streaming error: {str(e)}"
                history.append({"role": "assistant", "content": error_msg})
                yield history, message_input_value

        send_button.click(
            fn=send_message,
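The `send_message` handler above relies on Gradio's generator convention: each `yield` returns the next UI state (updated chat history plus the cleared input box). A minimal, self-contained sketch of that pattern, with a hypothetical `fake_stream` helper standing in for `chat_with_agent` (not part of the app):

```python
def fake_stream(message, history):
    """Sketch of the send_message generator pattern:
    each yield is (updated chat history, cleared input box)."""
    history = list(history)
    partial = ""
    for word in f"Echo: {message}".split():
        partial += word + " "
        # Append the growing assistant reply to a copy of the history
        yield history + [{"role": "assistant", "content": partial.strip()}], ""
```

Iterating the generator shows the assistant message growing one word per update while the input box stays cleared.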
        ...
        ### 🛡️ Admin Rules & Regulations
        Upload or manage tenant-specific governance rules (red-flag patterns, compliance policies, etc.).

        **Upload Methods:**
        - **Text Input:** Enter one rule per line in the text box
        - **File Upload:** Upload rules from TXT, PDF, DOC, or DOCX files

        **Features:**
        - Rules are automatically enhanced by an LLM (identifies edge cases, improves patterns)
        - Comment lines (starting with #) are automatically ignored
        - Use the delete box to remove an exact rule
        - Refresh anytime to view the latest rule set
        """
        )
        ...
        refresh_rules_button = gr.Button("Refresh Rules", variant="secondary")
        gr.Markdown("")

        with gr.Row():
            with gr.Column(scale=1):
                rules_input = gr.Textbox(
                    label="Rules / Regulations (Text Input)",
                    placeholder="Enter one rule per line...",
                    lines=6
                )
                upload_rules_button = gr.Button("Upload / Append Rules", variant="primary")

            with gr.Column(scale=1):
                gr.Markdown("**OR**")
                rules_file_upload = gr.File(
                    label="Upload Rules File",
                    file_types=[".txt", ".pdf", ".doc", ".docx"],
                    type="filepath"
                )
                upload_file_button = gr.Button("Upload Rules from File", variant="primary")

        delete_rule_input = gr.Textbox(
            label="Delete Rule",
            ...
            outputs=[rules_status, rules_summary, rules_table]
        )

        upload_file_button.click(
            fn=add_rules_from_file,
            inputs=[tenant_id_input, rules_file_upload],
            outputs=[rules_status, rules_summary, rules_table]
        )

        delete_rule_button.click(
            fn=delete_rule_and_refresh,
            inputs=[tenant_id_input, delete_rule_input],
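The chunked upload path above splits rules into batches of five and derives the chunk count with ceiling division. A minimal, self-contained sketch of that batching arithmetic (the `chunked` helper is illustrative, not a function from the repo):

```python
def chunked(items, size=5):
    """Split a list into consecutive batches of at most `size` items,
    mirroring the CHUNK_SIZE batching used by add_admin_rules."""
    return [items[i:i + size] for i in range(0, len(items), size)]

# 12 rules -> 3 batches of 5, 5, and 2; (n + size - 1) // size is the
# same ceiling division the app uses to compute total_chunks.
rules = [f"rule-{n}" for n in range(12)]
batches = chunked(rules)
```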
backend/api/routes/admin.py
CHANGED

Before:

@@ -1,15 +1,19 @@
from fastapi import APIRouter, Header, HTTPException, Query
from pydantic import BaseModel
from typing import List, Optional, Dict, Any
from datetime import datetime, timedelta

from backend.api.storage.rules_store import RulesStore
from backend.api.storage.analytics_store import AnalyticsStore

router = APIRouter()

analytics_store = AnalyticsStore()


class RulePayload(BaseModel):

@@ -61,10 +65,17 @@ async def get_redflag_rules(
async def add_redflag_rule(
    payload: Optional[RulePayload] = None,
    rule: Optional[str] = None,
-   x_tenant_id: str = Header(None)
):
    """
-   Adds a new red-flag rule to this tenant
    Accepts either JSON body or query parameter ?rule=...
    JSON body supports: rule, pattern (regex), severity (low/medium/high/critical), description, enabled
    """

@@ -80,32 +91,76 @@ async def add_redflag_rule(
    if not rule_value:
        raise HTTPException(status_code=400, detail="Rule cannot be empty")

    # Validate severity
-   if ...

    rules_store.add_rule(
        x_tenant_id,
-       pattern=...
-       severity=...
-       description=...
        enabled=enabled
    )
    rules = get_rules_for_tenant(x_tenant_id)

    return {
        "tenant_id": x_tenant_id,
        ...
        "rules": rules
    }

@@ -113,10 +168,16 @@ async def add_redflag_rule(
@router.post("/rules/bulk")
async def add_redflag_rules_bulk(
    payload: BulkRulePayload,
-   x_tenant_id: str = Header(None)
):
    """
    Adds multiple rules in one call.
    """
    if not x_tenant_id:
        raise HTTPException(status_code=400, detail="Missing tenant ID")

@@ -124,13 +185,85 @@ async def add_redflag_rules_bulk(
    if not payload.rules:
        raise HTTPException(status_code=400, detail="No rules provided")

    ...
    rules = get_rules_for_tenant(x_tenant_id)

    return {
        "tenant_id": x_tenant_id,
        "added_rules": added,
        "rules": rules
    }

@@ -220,6 +353,165 @@ async def get_tool_logs(
    }

@router.get("/tenants")
async def list_tenants():
    """
After:

from fastapi import APIRouter, Header, HTTPException, Query, UploadFile, File
from pydantic import BaseModel
from typing import List, Optional, Dict, Any
from datetime import datetime, timedelta

from backend.api.storage.rules_store import RulesStore
from backend.api.storage.analytics_store import AnalyticsStore
from backend.api.services.rule_enhancer import RuleEnhancer
from backend.api.services.document_ingestion import extract_text_from_file_bytes

router = APIRouter()

# Initialize stores (table creation disabled by default to avoid blocking startup)
rules_store = RulesStore(auto_create_table=False)
analytics_store = AnalyticsStore()
rule_enhancer = RuleEnhancer()


class RulePayload(BaseModel):

    ...

async def add_redflag_rule(
    payload: Optional[RulePayload] = None,
    rule: Optional[str] = None,
    x_tenant_id: str = Header(None),
    enhance: bool = Query(True, description="Use LLM to enhance the rule before saving")
):
    """
    Adds a new red-flag rule for this tenant.

    Flow:
    1. Fetch existing rules for context
    2. Use the LLM to analyze and enhance the rule (identify edge cases, improve the pattern)
    3. Save the enhanced rule to the database

    Accepts either JSON body or query parameter ?rule=...
    JSON body supports: rule, pattern (regex), severity (low/medium/high/critical), description, enabled
    """

    ...

    if not rule_value:
        raise HTTPException(status_code=400, detail="Rule cannot be empty")

    # Step 1: Get existing rules for context
    existing_rules = rules_store.get_rules(x_tenant_id)

    # Step 2: Enhance the rule using the LLM (if enhance=True and an LLM is available)
    enhanced_data = None
    if enhance:
        try:
            enhanced_data = await rule_enhancer.enhance_rule(
                rule_value,
                existing_rules=existing_rules if existing_rules else None,
                context=payload.description if payload and payload.description else None
            )
        except Exception as e:
            # If enhancement fails, continue with the original rule
            print(f"Rule enhancement failed: {e}, using original rule")
            enhanced_data = None

    # Step 3: Use enhanced data if available, otherwise use the provided values
    if enhanced_data:
        final_rule = enhanced_data["rule"]
        final_pattern = enhanced_data["pattern"]
        final_severity = enhanced_data["severity"]
        final_description = enhanced_data["description"]
        edge_cases = enhanced_data.get("edge_cases", [])
        improvements = enhanced_data.get("improvements", [])
    else:
        # Use provided values or defaults
        final_rule = rule_value
        final_pattern = payload.pattern if payload else None
        final_severity = payload.severity if payload else "medium"
        final_description = payload.description if payload else None
        edge_cases = []
        improvements = []

    # Override with explicit values if provided (the user has the final say)
    if payload:
        if payload.pattern:
            final_pattern = payload.pattern
        if payload.severity:
            final_severity = payload.severity
        if payload.description:
            final_description = payload.description

    # Validate severity
    if final_severity not in ["low", "medium", "high", "critical"]:
        final_severity = "medium"

    enabled = payload.enabled if payload else True

    # Step 4: Save to the database
    rules_store.add_rule(
        x_tenant_id,
        final_rule,
        pattern=final_pattern,
        severity=final_severity,
        description=final_description,
        enabled=enabled
    )
    rules = get_rules_for_tenant(x_tenant_id)

    return {
        "tenant_id": x_tenant_id,
        "original_rule": rule_value,
        "added_rule": final_rule,
        "pattern": final_pattern or final_rule,
        "severity": final_severity,
        "description": final_description or final_rule,
        "enhanced": enhanced_data is not None,
        "edge_cases": edge_cases,
        "improvements": improvements,
        "rules": rules
    }


@router.post("/rules/bulk")
async def add_redflag_rules_bulk(
    payload: BulkRulePayload,
    x_tenant_id: str = Header(None),
    enhance: bool = Query(True, description="Use LLM to enhance rules before saving")
):
    """
    Adds multiple rules in one call.

    Flow:
    1. Fetch existing rules for context
    2. Use the LLM to enhance each rule (identify edge cases, improve patterns)
    3. Save the enhanced rules to the database
    """
    if not x_tenant_id:
        raise HTTPException(status_code=400, detail="Missing tenant ID")

    if not payload.rules:
        raise HTTPException(status_code=400, detail="No rules provided")

    # Filter out comment lines (starting with #) and empty lines
    cleaned = [
        rule.strip()
        for rule in payload.rules
        if rule.strip() and not rule.strip().startswith("#")
    ]

    # Step 1: Get existing rules for context
    existing_rules = rules_store.get_rules(x_tenant_id)

    # Step 2: Enhance rules using the LLM (with chunk protection)
    enhanced_rules_data = []
    if enhance:
        try:
            # Process rules in chunks to avoid timeouts
            # (the frontend already chunks to 5, but the backend adds extra safety)
            CHUNK_SIZE = 5
            for i in range(0, len(cleaned), CHUNK_SIZE):
                chunk = cleaned[i:i + CHUNK_SIZE]
                try:
                    chunk_enhanced = await rule_enhancer.enhance_rules_bulk(
                        chunk,
                        existing_rules=existing_rules if existing_rules else None
                    )
                    enhanced_rules_data.extend(chunk_enhanced)
                except Exception as e:
                    print(f"Chunk {i//CHUNK_SIZE + 1} enhancement failed: {e}")
                    # Add fallback entries for this chunk
                    for rule in chunk:
                        enhanced_rules_data.append({
                            "rule": rule,
                            "pattern": rule,
                            "description": rule,
                            "severity": "medium",
                            "edge_cases": [],
                            "improvements": ["Enhancement failed for this chunk"],
                            "keywords": []
                        })
        except Exception as e:
            print(f"Bulk rule enhancement failed: {e}, using original rules")
            enhanced_rules_data = []

    # Step 3: Save the enhanced rules to the database
    added = []
    enhancement_summary = []

    for i, rule in enumerate(cleaned):
        try:
            if enhanced_rules_data and i < len(enhanced_rules_data):
                enhanced = enhanced_rules_data[i]
                success = rules_store.add_rule(
                    x_tenant_id,
                    enhanced["rule"],
                    pattern=enhanced["pattern"],
                    severity=enhanced["severity"],
                    description=enhanced["description"],
                    enabled=True
                )
                if success:
                    added.append(enhanced["rule"])
                    if enhanced.get("improvements") and "failed" not in str(enhanced.get("improvements", [])).lower():
                        enhancement_summary.append(f"{enhanced['rule']}: {', '.join(enhanced['improvements'][:2])}")
            else:
                # Fall back to the original rule
                success = rules_store.add_rule(x_tenant_id, rule)
                if success:
                    added.append(rule)
        except Exception as e:
            print(f"Error saving rule {i+1}: {e}")
            # Continue with the next rule
            continue

    rules = get_rules_for_tenant(x_tenant_id)

    return {
        "tenant_id": x_tenant_id,
        "added_rules": added,
        "enhanced": len(enhanced_rules_data) > 0,
        "enhancement_summary": enhancement_summary,
        "rules": rules
    }

    ...
    }


@router.post("/rules/upload-file")
async def upload_rules_from_file(
    file: UploadFile = File(...),
    x_tenant_id: str = Header(None),
    enhance: bool = Query(True, description="Use LLM to enhance rules before saving")
):
    """
    Upload rules from a file (TXT, PDF, DOC, DOCX).
    Extracts text from the file and processes the rules.
    """
    if not x_tenant_id:
        raise HTTPException(status_code=400, detail="Missing tenant ID")

    if not file.filename:
        raise HTTPException(status_code=400, detail="No file provided")

    file_ext = file.filename.split('.')[-1].lower() if '.' in file.filename else ''
    if file_ext not in ['txt', 'pdf', 'doc', 'docx', 'md']:
        raise HTTPException(
            status_code=400,
            detail=f"Unsupported file type: {file_ext}. Supported: TXT, PDF, DOC, DOCX, MD"
        )

    try:
        # Read the file bytes
        file_bytes = await file.read()
        if not file_bytes:
            raise HTTPException(status_code=400, detail="File is empty")

        # Extract text from the file
        try:
            extracted_text = extract_text_from_file_bytes(file_bytes, file.filename)
        except ValueError as e:
            raise HTTPException(status_code=400, detail=str(e))

        if not extracted_text or not extracted_text.strip():
            raise HTTPException(status_code=400, detail="No text could be extracted from file")

        # Parse rules (filter comments and empty lines)
        rules = [
            rule.strip()
            for rule in extracted_text.splitlines()
            if rule.strip() and not rule.strip().startswith("#")
        ]

        if not rules:
            raise HTTPException(status_code=400, detail="No valid rules found in file (after filtering comments)")

        # Get existing rules for context
        existing_rules = rules_store.get_rules(x_tenant_id)

        # Enhance rules using the LLM
        enhanced_rules_data = []
        if enhance:
            try:
                CHUNK_SIZE = 5
                for i in range(0, len(rules), CHUNK_SIZE):
                    chunk = rules[i:i + CHUNK_SIZE]
                    try:
                        chunk_enhanced = await rule_enhancer.enhance_rules_bulk(
                            chunk,
                            existing_rules=existing_rules if existing_rules else None
                        )
                        enhanced_rules_data.extend(chunk_enhanced)
                    except Exception as e:
                        print(f"Chunk {i//CHUNK_SIZE + 1} enhancement failed: {e}")
                        for rule in chunk:
                            enhanced_rules_data.append({
                                "rule": rule,
                                "pattern": rule,
                                "description": rule,
                                "severity": "medium",
                                "edge_cases": [],
                                "improvements": ["Enhancement failed for this chunk"],
                                "keywords": []
                            })
            except Exception as e:
                print(f"Bulk rule enhancement failed: {e}, using original rules")
                enhanced_rules_data = []

        # Save rules to the database
        added = []
        enhancement_summary = []

        for i, rule in enumerate(rules):
            try:
                if enhanced_rules_data and i < len(enhanced_rules_data):
                    enhanced = enhanced_rules_data[i]
                    success = rules_store.add_rule(
                        x_tenant_id,
                        enhanced["rule"],
                        pattern=enhanced["pattern"],
                        severity=enhanced["severity"],
                        description=enhanced["description"],
                        enabled=True
                    )
                    if success:
                        added.append(enhanced["rule"])
                        if enhanced.get("improvements") and "failed" not in str(enhanced.get("improvements", [])).lower():
                            enhancement_summary.append(f"{enhanced['rule']}: {', '.join(enhanced['improvements'][:2])}")
                else:
                    success = rules_store.add_rule(x_tenant_id, rule)
                    if success:
                        added.append(rule)
            except Exception as e:
                print(f"Error saving rule {i+1}: {e}")
                continue

        rules_list = get_rules_for_tenant(x_tenant_id)

        return {
            "tenant_id": x_tenant_id,
            "filename": file.filename,
            "added_rules": added,
            "total_extracted": len(rules),
            "enhanced": len(enhanced_rules_data) > 0,
            "enhancement_summary": enhancement_summary,
            "rules": rules_list
        }

    except HTTPException:
        raise
    except Exception as e:
        raise HTTPException(status_code=500, detail=f"Error processing file: {str(e)}")


@router.post("/setup/table")
async def create_supabase_table(x_tenant_id: str = Header(None)):
    """
    Create the admin_rules table in Supabase if it doesn't exist.
    This endpoint can be called after startup to set up the table.
    """
    try:
        if rules_store.use_supabase:
            success = rules_store.create_table_if_needed()
            if success:
                return {
                    "status": "success",
                    "message": "Table created successfully",
                    "table": "admin_rules"
                }
            else:
                return {
                    "status": "manual_required",
                    "message": "Automatic table creation failed. Please run the SQL manually.",
                    "instructions": "Go to Supabase Dashboard → SQL Editor → Run supabase_admin_rules_table.sql"
                }
        else:
            return {
                "status": "not_applicable",
                "message": "Using SQLite, not Supabase. Table creation not needed."
            }
    except Exception as e:
        return {
            "status": "error",
            "message": str(e)
        }


@router.get("/tenants")
async def list_tenants():
    """
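Both bulk endpoints above apply the same line filter before enhancement: keep non-empty lines that do not start with `#`. A minimal, self-contained sketch of that filtering (the `parse_rules` helper name is illustrative, not from the repo):

```python
def parse_rules(text: str):
    """Apply the same comment/blank-line filtering the bulk endpoints use."""
    return [
        line.strip()
        for line in text.splitlines()
        if line.strip() and not line.strip().startswith("#")
    ]

sample = (
    "# header comment\n"
    "Block requests for credit card numbers\n"
    "\n"
    "  Flag wire transfers over $10,000\n"
)
rules = parse_rules(sample)
```

Note that lines are stripped before the `#` check, so indented comments are also dropped.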
backend/api/routes/agent.py
CHANGED

Before:

@@ -3,10 +3,13 @@
# =============================================================

from fastapi import APIRouter
from pydantic import BaseModel
import os
import sys
from pathlib import Path

# Add backend to path for imports
backend_dir = Path(__file__).parent.parent.parent

@@ -47,6 +50,160 @@ async def agent_chat(req: ChatRequest):
    return await orchestrator.handle(agent_req)

@router.post("/debug")
async def agent_debug(req: ChatRequest):
    """
After:

# =============================================================

from fastapi import APIRouter
from fastapi.responses import StreamingResponse
from pydantic import BaseModel
import os
import sys
import json
from pathlib import Path
from typing import AsyncGenerator

# Add backend to path for imports
backend_dir = Path(__file__).parent.parent.parent

    ...
    return await orchestrator.handle(agent_req)


@router.post("/message/stream")
async def agent_chat_stream(req: ChatRequest):
    """Stream the agent response word by word using Server-Sent Events."""
    agent_req = AgentRequest(
        tenant_id=req.tenant_id,
        user_id=req.user_id,
        message=req.message,
        conversation_history=req.conversation_history,
        temperature=req.temperature
    )

    async def generate_stream() -> AsyncGenerator[str, None]:
        """Generate the streaming response."""
        try:
            # FIRST: check admin rules - if any rule matches, respond according to the rule
            yield f"data: {json.dumps({'status': 'processing', 'message': 'Checking rules...'})}\n\n"
            matches = await orchestrator.redflag.check(agent_req.tenant_id, agent_req.message)

            if matches:
                # Categorize rules: brief-response rules vs. blocking rules
                brief_response_rules = []
                blocking_rules = []

                for match in matches:
                    rule_text = (match.description or match.pattern or "").lower()
                    is_brief_rule = (
                        match.severity == "low" and (
                            "greeting" in rule_text or
                            "brief" in rule_text or
                            "simple response" in rule_text or
                            "keep.*response.*brief" in rule_text or
                            "do not.*verbose" in rule_text or
                            "respond.*briefly" in rule_text
                        )
                    )

                    if is_brief_rule:
                        brief_response_rules.append(match)
                    else:
                        blocking_rules.append(match)

                # Handle brief-response rules (greetings, etc.) - return immediately
                if brief_response_rules and not blocking_rules:
                    brief_responses = [
                        "Hello! How can I help you today?",
                        "Hi there! What can I assist you with?",
                        "Hello! I'm here to help. What do you need?",
                        "Hi! How can I assist you?"
                    ]
                    import random
                    brief_response = random.choice(brief_responses)

                    # Stream the brief response word by word
                    yield f"data: {json.dumps({'status': 'streaming', 'message': ''})}\n\n"
                    words = brief_response.split()
                    for word in words:
                        yield f"data: {json.dumps({'token': word + ' ', 'done': False})}\n\n"
                    yield f"data: {json.dumps({'token': '', 'done': True})}\n\n"
                    return

                # Handle blocking rules (security, compliance, etc.)
                if blocking_rules:
                    matches = blocking_rules

                if matches:
                    # For red flags, generate a streaming response via the LLM
                    violations_details = []
                    for i, m in enumerate(matches, 1):
                        rule_name = m.description or m.pattern or "Policy violation"
                        detail = f"{i}. **{rule_name}** (Severity: {m.severity})"
                        if m.matched_text:
                            detail += f"\n   - Detected phrase: \"{m.matched_text}\""
                        violations_details.append(detail)

                    llm_prompt = f"""A user made the following request: "{agent_req.message}"

However, this request violates company policies. The following policy violations were detected:

{chr(10).join(violations_details)}
@router.post("/message/stream")
|
| 54 |
+
async def agent_chat_stream(req: ChatRequest):
|
| 55 |
+
"""Stream agent response word by word using Server-Sent Events."""
|
| 56 |
+
agent_req = AgentRequest(
|
| 57 |
+
tenant_id=req.tenant_id,
|
| 58 |
+
user_id=req.user_id,
|
| 59 |
+
message=req.message,
|
| 60 |
+
conversation_history=req.conversation_history,
|
| 61 |
+
temperature=req.temperature
|
| 62 |
+
)
|
| 63 |
+
|
| 64 |
+
async def generate_stream() -> AsyncGenerator[str, None]:
|
| 65 |
+
"""Generate streaming response."""
|
| 66 |
+
try:
|
| 67 |
+
# FIRST: Check admin rules - if any rule matches, respond according to rule
|
| 68 |
+
yield f"data: {json.dumps({'status': 'processing', 'message': 'Checking rules...'})}\n\n"
|
| 69 |
+
matches = await orchestrator.redflag.check(agent_req.tenant_id, agent_req.message)
|
| 70 |
+
|
| 71 |
+
if matches:
|
| 72 |
+
# Categorize rules: brief response rules vs blocking rules
|
| 73 |
+
brief_response_rules = []
|
| 74 |
+
blocking_rules = []
|
| 75 |
+
|
| 76 |
+
for match in matches:
|
| 77 |
+
rule_text = (match.description or match.pattern or "").lower()
|
| 78 |
+
is_brief_rule = (
|
| 79 |
+
match.severity == "low" and (
|
| 80 |
+
"greeting" in rule_text or
|
| 81 |
+
"brief" in rule_text or
|
| 82 |
+
"simple response" in rule_text or
|
| 83 |
+
"keep.*response.*brief" in rule_text or
|
| 84 |
+
"do not.*verbose" in rule_text or
|
| 85 |
+
"respond.*briefly" in rule_text
|
| 86 |
+
)
|
| 87 |
+
)
|
| 88 |
+
|
| 89 |
+
if is_brief_rule:
|
| 90 |
+
brief_response_rules.append(match)
|
| 91 |
+
else:
|
| 92 |
+
blocking_rules.append(match)
|
| 93 |
+
|
| 94 |
+
# Handle brief response rules (greetings, etc.) - return immediately
|
| 95 |
+
if brief_response_rules and not blocking_rules:
|
| 96 |
+
brief_responses = [
|
| 97 |
+
"Hello! How can I help you today?",
|
| 98 |
+
"Hi there! What can I assist you with?",
|
| 99 |
+
"Hello! I'm here to help. What do you need?",
|
| 100 |
+
"Hi! How can I assist you?"
|
| 101 |
+
]
|
| 102 |
+
import random
|
| 103 |
+
brief_response = random.choice(brief_responses)
|
| 104 |
+
|
| 105 |
+
# Stream the brief response word by word
|
| 106 |
+
yield f"data: {json.dumps({'status': 'streaming', 'message': ''})}\n\n"
|
| 107 |
+
words = brief_response.split()
|
| 108 |
+
for word in words:
|
| 109 |
+
yield f"data: {json.dumps({'token': word + ' ', 'done': False})}\n\n"
|
| 110 |
+
yield f"data: {json.dumps({'token': '', 'done': True})}\n\n"
|
| 111 |
+
return
|
| 112 |
+
|
| 113 |
+
# Handle blocking rules (security, compliance, etc.)
|
| 114 |
+
if blocking_rules:
|
| 115 |
+
matches = blocking_rules
|
| 116 |
+
|
| 117 |
+
if matches:
|
| 118 |
+
# For red flags, generate streaming response via LLM
|
| 119 |
+
violations_details = []
|
| 120 |
+
for i, m in enumerate(matches, 1):
|
| 121 |
+
rule_name = m.description or m.pattern or "Policy violation"
|
| 122 |
+
detail = f"{i}. **{rule_name}** (Severity: {m.severity})"
|
| 123 |
+
if m.matched_text:
|
| 124 |
+
detail += f"\n - Detected phrase: \"{m.matched_text}\""
|
| 125 |
+
violations_details.append(detail)
|
| 126 |
+
|
| 127 |
+
llm_prompt = f"""A user made the following request: "{agent_req.message}"
|
| 128 |
+
|
| 129 |
+
However, this request violates company policies. The following policy violations were detected:
|
| 130 |
+
|
| 131 |
+
{chr(10).join(violations_details)}
|
| 132 |
+
|
| 133 |
+
Your task: Write a clear, professional, and empathetic response to inform the user that:
|
| 134 |
+
1. Their request cannot be processed due to policy violations
|
| 135 |
+
2. Which specific policy was violated (mention it naturally)
|
| 136 |
+
3. The incident has been logged for security review
|
| 137 |
+
4. They should contact an administrator if they need assistance or believe this is an error
|
| 138 |
+
|
| 139 |
+
Write a natural, conversational response (2-4 sentences) that feels helpful rather than robotic. Be professional but understanding.
|
| 140 |
+
|
| 141 |
+
Response:"""
|
| 142 |
+
|
| 143 |
+
async for token in orchestrator.llm.stream_call(llm_prompt, agent_req.temperature):
|
| 144 |
+
yield f"data: {json.dumps({'token': token, 'done': False})}\n\n"
|
| 145 |
+
|
| 146 |
+
yield f"data: {json.dumps({'token': '', 'done': True})}\n\n"
|
| 147 |
+
return
|
| 148 |
+
|
| 149 |
+
# STEP 2: ONLY IF NO RULES MATCHED - Proceed with normal flow
|
| 150 |
+
yield f"data: {json.dumps({'status': 'classifying', 'message': 'Understanding your question...'})}\n\n"
|
| 151 |
+
intent = await orchestrator.intent.classify(agent_req.message)
|
| 152 |
+
|
| 153 |
+
# Pre-fetch RAG if needed
|
| 154 |
+
rag_results = []
|
| 155 |
+
if intent == "rag" or "rag" in intent.lower():
|
| 156 |
+
yield f"data: {json.dumps({'status': 'searching', 'message': 'Searching knowledge base...'})}\n\n"
|
| 157 |
+
try:
|
| 158 |
+
rag_prefetch = await orchestrator.mcp.call_rag(agent_req.tenant_id, agent_req.message)
|
| 159 |
+
if isinstance(rag_prefetch, dict):
|
| 160 |
+
rag_results = rag_prefetch.get("results") or rag_prefetch.get("hits") or []
|
| 161 |
+
except Exception:
|
| 162 |
+
pass
|
| 163 |
+
|
| 164 |
+
# Build prompt with context
|
| 165 |
+
if rag_results:
|
| 166 |
+
context = "\n\n".join([r.get("text", "")[:500] for r in rag_results[:3]])
|
| 167 |
+
prompt = f"""Based on the following context, answer the user's question:
|
| 168 |
+
|
| 169 |
+
Context:
|
| 170 |
+
{context}
|
| 171 |
+
|
| 172 |
+
User's question: {agent_req.message}
|
| 173 |
+
|
| 174 |
+
Answer:"""
|
| 175 |
+
else:
|
| 176 |
+
prompt = agent_req.message
|
| 177 |
+
|
| 178 |
+
# Signal that streaming is starting
|
| 179 |
+
yield f"data: {json.dumps({'status': 'streaming', 'message': ''})}\n\n"
|
| 180 |
+
|
| 181 |
+
# Stream LLM response - flush each token immediately
|
| 182 |
+
# Import asyncio for potential delays if needed
|
| 183 |
+
import asyncio
|
| 184 |
+
async for token in orchestrator.llm.stream_call(prompt, agent_req.temperature):
|
| 185 |
+
if token: # Only send non-empty tokens
|
| 186 |
+
yield f"data: {json.dumps({'token': token, 'done': False})}\n\n"
|
| 187 |
+
# Small delay to ensure proper flushing (optional, can remove if not needed)
|
| 188 |
+
await asyncio.sleep(0) # Yield control to event loop
|
| 189 |
+
|
| 190 |
+
yield f"data: {json.dumps({'token': '', 'done': True})}\n\n"
|
| 191 |
+
|
| 192 |
+
except Exception as e:
|
| 193 |
+
error_msg = json.dumps({'error': str(e), 'done': True})
|
| 194 |
+
yield f"data: {error_msg}\n\n"
|
| 195 |
+
|
| 196 |
+
return StreamingResponse(
|
| 197 |
+
generate_stream(),
|
| 198 |
+
media_type="text/event-stream",
|
| 199 |
+
headers={
|
| 200 |
+
"Cache-Control": "no-cache",
|
| 201 |
+
"Connection": "keep-alive",
|
| 202 |
+
"X-Accel-Buffering": "no"
|
| 203 |
+
}
|
| 204 |
+
)
|
| 205 |
+
|
| 206 |
+
|
| 207 |
@router.post("/debug")
|
| 208 |
async def agent_debug(req: ChatRequest):
|
| 209 |
"""
|
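The `/message/stream` endpoint above emits Server-Sent Events as `data: <json>\n\n` frames, each carrying either a `status` update or a `token` with a `done` flag. A minimal client-side sketch (standalone illustration, not part of the commit) that parses such frames and reassembles the streamed reply:

```python
import json

def parse_sse_frames(raw: str):
    """Split a raw SSE stream into the JSON payloads sent by /message/stream."""
    events = []
    for frame in raw.split("\n\n"):
        frame = frame.strip()
        if frame.startswith("data: "):
            events.append(json.loads(frame[len("data: "):]))
    return events

def assemble_reply(events):
    """Concatenate token events until a frame with done=True arrives."""
    out = []
    for ev in events:
        if ev.get("done"):
            break
        if "token" in ev:
            out.append(ev["token"])
    return "".join(out)

# Example stream as the endpoint would send it
raw = (
    'data: {"status": "streaming", "message": ""}\n\n'
    'data: {"token": "Hello ", "done": false}\n\n'
    'data: {"token": "world", "done": false}\n\n'
    'data: {"token": "", "done": true}\n\n'
)
events = parse_sse_frames(raw)
print(assemble_reply(events))  # Hello world
```

In a real client the same loop would run over `httpx`'s `aiter_lines()` against the live endpoint instead of a string.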
backend/api/services/agent_orchestrator.py
CHANGED

@@ -54,50 +54,110 @@ class AgentOrchestrator:
             "message_preview": req.message[:120]
         })

-        # 1)
+        # 1) FIRST: Check admin rules - if any rule matches, respond according to rule
         matches: List[RedFlagMatch] = await self.redflag.check(req.tenant_id, req.message)
         reasoning_trace.append({
-            "step": "
+            "step": "admin_rules_check",
             "match_count": len(matches),
             "matches": [m.__dict__ for m in matches]
         })

-        # Log red-flag violations
-        for match in matches:
-            self.analytics.log_redflag_violation(
-                tenant_id=req.tenant_id,
-                rule_id=match.rule_id,
-                rule_pattern=match.pattern,
-                severity=match.severity,
-                matched_text=match.matched_text,
-                confidence=match.confidence,
-                message_preview=req.message[:200],
-                user_id=req.user_id
-            )
-
         if matches:
-            #
-            )
-
-            #
+            # Log all rule matches
+            for match in matches:
+                self.analytics.log_redflag_violation(
+                    tenant_id=req.tenant_id,
+                    rule_id=match.rule_id,
+                    rule_pattern=match.pattern,
+                    severity=match.severity,
+                    matched_text=match.matched_text,
+                    confidence=match.confidence,
+                    message_preview=req.message[:200],
+                    user_id=req.user_id
+                )
+
+            # Categorize rules: brief response rules vs blocking rules
+            brief_response_rules = []
+            blocking_rules = []
+
+            for match in matches:
+                rule_text = (match.description or match.pattern or "").lower()
+                is_brief_rule = (
+                    match.severity == "low" and (
+                        "greeting" in rule_text or
+                        "brief" in rule_text or
+                        "simple response" in rule_text or
+                        "keep.*response.*brief" in rule_text or
+                        "do not.*verbose" in rule_text or
+                        "respond.*briefly" in rule_text
+                    )
+                )
+
+                if is_brief_rule:
+                    brief_response_rules.append(match)
+                else:
+                    blocking_rules.append(match)
+
+            # Handle brief response rules (greetings, etc.) - return immediately
+            if brief_response_rules and not blocking_rules:
+                # Return brief response without proceeding to normal flow
+                brief_responses = [
+                    "Hello! How can I help you today?",
+                    "Hi there! What can I assist you with?",
+                    "Hello! I'm here to help. What do you need?",
+                    "Hi! How can I assist you?"
+                ]
+                import random
+                brief_response = random.choice(brief_responses)
+
+                reasoning_trace.append({
+                    "step": "brief_response_rule_matched",
+                    "action": "brief_response",
+                    "matched_rules": [m.rule_id for m in brief_response_rules],
+                    "message": "Brief response rule matched, returning brief response (skipping normal flow)"
+                })
+
+                total_latency_ms = int((time.time() - start_time) * 1000)
+                self.analytics.log_agent_query(
+                    tenant_id=req.tenant_id,
+                    message_preview=req.message[:200],
+                    intent="greeting",
+                    tools_used=[],
+                    total_tokens=len(brief_response) // 4,
+                    total_latency_ms=total_latency_ms,
+                    success=True,
+                    user_id=req.user_id
+                )
+
+                return AgentResponse(
+                    text=brief_response,
+                    decision=AgentDecision(action="respond", tool=None, tool_input=None, reason="brief_response_rule"),
+                    reasoning_trace=reasoning_trace
+                )
+
+            # Handle blocking rules (security, compliance, etc.) - block and return immediately
+            if blocking_rules:
+                # Notify admin asynchronously
+                try:
+                    await self.redflag.notify_admin(req.tenant_id, blocking_rules, source_payload={"message": req.message, "user_id": req.user_id})
+                except Exception:
+                    pass
+
+                decision = AgentDecision(
+                    action="block",
+                    tool="admin",
+                    tool_input={"violations": [m.__dict__ for m in blocking_rules]},
+                    reason="admin_rule_violation"
+                )
+
+                # Build detailed prompt for LLM to generate natural red flag response
+                violations_details = []
+                for i, m in enumerate(blocking_rules, 1):
+                    rule_name = m.description or m.pattern or "Policy violation"
+                    detail = f"{i}. **{rule_name}** (Severity: {m.severity})"
+                    if m.matched_text:
+                        detail += f"\n   - Detected phrase: \"{m.matched_text}\""
+                    violations_details.append(detail)

                 llm_prompt = f"""A user made the following request: "{req.message}"

@@ -157,11 +217,11 @@ Response:"""
                 return AgentResponse(
                     text=llm_response,
                     decision=decision,
-                    tool_traces=[{"redflags": [m.__dict__ for m in
+                    tool_traces=[{"redflags": [m.__dict__ for m in blocking_rules]}],
                     reasoning_trace=reasoning_trace
                 )
-
-        # 2)
+
+        # 2) ONLY IF NO RULES MATCHED: Proceed with normal flow (intent classification, RAG, etc.)
         intent = await self.intent.classify(req.message)
         reasoning_trace.append({
             "step": "intent_detection",
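One thing worth noting about the categorization above: entries such as `"keep.*response.*brief"` are checked with Python's substring operator `in`, so they only fire when the rule text literally contains the characters `.*`. If regex-style matching is the intent, a hypothetical variant using `re.search` would behave like this (a sketch, not the committed code):

```python
import re

# Hypothetical regex equivalents of the substring checks used in the orchestrator
BRIEF_PATTERNS = [
    r"greeting",
    r"\bbrief\b",
    r"simple response",
    r"keep.*response.*brief",
    r"do not.*verbose",
    r"respond.*briefly",
]

def is_brief_rule(severity: str, rule_text: str) -> bool:
    """True when a low-severity rule asks the agent for a short reply."""
    text = rule_text.lower()
    return severity == "low" and any(re.search(p, text) for p in BRIEF_PATTERNS)

print(is_brief_rule("low", "Keep every response brief and friendly"))  # True
print(is_brief_rule("low", "keep.*response.*brief"))                   # True
print(is_brief_rule("high", "greeting"))                               # False
```

With `re.search`, wordings like "keep every response brief" match the wildcard pattern directly, whereas the substring check relies on one of the plain keywords also being present.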
backend/api/services/llm_client.py
CHANGED

@@ -1,5 +1,6 @@
 import os, json
 import httpx
+from typing import AsyncGenerator


 class LLMClient:

@@ -51,3 +52,46 @@ class LLMClient:
         except Exception as e:
             raise RuntimeError(f"LLM call failed: {str(e)}")
         raise RuntimeError("Unsupported backend")
+
+    async def stream_call(self, prompt: str, temperature: float = 0.0) -> AsyncGenerator[str, None]:
+        """Stream LLM response token by token."""
+        if self.backend == "ollama":
+            if not self.url or not self.model:
+                raise RuntimeError(f"LLM not configured: url={self.url}, model={self.model}")
+
+            try:
+                async with httpx.AsyncClient(timeout=300.0) as client:
+                    async with client.stream(
+                        "POST",
+                        f"{self.url}/api/generate",
+                        json={
+                            "model": self.model,
+                            "prompt": prompt,
+                            "stream": True,
+                            "options": {"temperature": temperature}
+                        }
+                    ) as response:
+                        response.raise_for_status()
+                        async for line in response.aiter_lines():
+                            if line:
+                                try:
+                                    data = json.loads(line)
+                                    token = data.get("response", "")
+                                    if token:
+                                        yield token
+                                    # Check if done
+                                    if data.get("done", False):
+                                        break
+                                except json.JSONDecodeError:
+                                    continue
+            except httpx.ConnectError:
+                raise RuntimeError(
+                    f"Cannot connect to Ollama at {self.url}. "
+                    f"Is Ollama running? Start it with: ollama serve"
+                )
+            except Exception as e:
+                raise RuntimeError(f"LLM streaming failed: {str(e)}")
+        else:
+            raise RuntimeError("Streaming not supported for this backend")
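`stream_call` parses Ollama's `/api/generate` streaming output, which is newline-delimited JSON where each object carries a `response` fragment and a `done` flag. A synchronous sketch of the same parsing loop (illustration only, using canned lines instead of a live HTTP stream):

```python
import json

def collect_tokens(ndjson_lines):
    """Replay stream_call's parsing over Ollama-style NDJSON lines."""
    tokens = []
    for line in ndjson_lines:
        if not line:
            continue
        try:
            data = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip partial or garbled lines, as stream_call does
        if data.get("response"):
            tokens.append(data["response"])
        if data.get("done", False):
            break  # stop once the model reports completion
    return "".join(tokens)

lines = [
    '{"response": "Hel", "done": false}',
    '{"response": "lo", "done": false}',
    'not-json',
    '{"response": "", "done": true}',
    '{"response": "ignored", "done": false}',
]
print(collect_tokens(lines))  # Hello
```

The `done` break matters: Ollama's final object can carry an empty `response` plus timing metadata, and anything after it should not be emitted.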
backend/api/services/rule_enhancer.py
ADDED

@@ -0,0 +1,184 @@
+"""
+Rule Enhancement Service using LLM
+Analyzes rules for edge cases and improves them before saving to database.
+"""
+
+import os
+from typing import List, Dict, Any, Optional
+from ..services.llm_client import LLMClient
+
+
+class RuleEnhancer:
+    """
+    Uses LLM to analyze and enhance admin rules.
+    Identifies edge cases, improves patterns, and suggests better descriptions.
+    """
+
+    def __init__(self, llm_client: Optional[LLMClient] = None):
+        self.llm = llm_client or LLMClient(
+            backend=os.getenv("LLM_BACKEND", "ollama"),
+            url=os.getenv("OLLAMA_URL"),
+            api_key=os.getenv("GROQ_API_KEY"),
+            model=os.getenv("OLLAMA_MODEL", "llama3.1:latest")
+        )
+
+    async def enhance_rule(
+        self,
+        rule_text: str,
+        existing_rules: Optional[List[str]] = None,
+        context: Optional[str] = None
+    ) -> Dict[str, Any]:
+        """
+        Enhance a single rule using LLM analysis.
+
+        Args:
+            rule_text: The original rule text
+            existing_rules: List of existing rules for context
+            context: Additional context about the rule
+
+        Returns:
+            Dictionary with enhanced rule data:
+            - rule: Enhanced rule text
+            - pattern: Improved regex pattern
+            - description: Better description
+            - severity: Suggested severity
+            - edge_cases: List of identified edge cases
+            - improvements: List of suggested improvements
+        """
+        existing_context = ""
+        if existing_rules:
+            existing_context = "\n".join([f"- {r}" for r in existing_rules[:10]])  # Limit to 10 rules
+
+        context_text = f"\nAdditional context: {context}" if context else ""
+
+        prompt = f"""You are an expert in policy rule analysis and pattern matching. Analyze the following rule and enhance it.
+
+Original Rule: "{rule_text}"
+
+Existing Rules (for context):
+{existing_context if existing_context else "None"}
+{context_text}
+
+Your task:
+1. Analyze the rule for potential edge cases and improvements
+2. Generate an improved regex pattern that catches more variations
+3. Write a clear, comprehensive description
+4. Suggest an appropriate severity level (low/medium/high/critical)
+5. Identify edge cases that might be missed
+6. Suggest improvements
+
+Respond in JSON format with the following structure:
+{{
+    "rule": "Enhanced rule text (improved version of original)",
+    "pattern": "Improved regex pattern (e.g., '.*password.*|.*pwd.*|.*passcode.*')",
+    "description": "Clear description of what this rule detects",
+    "severity": "low|medium|high|critical",
+    "edge_cases": ["Edge case 1", "Edge case 2", ...],
+    "improvements": ["Improvement 1", "Improvement 2", ...],
+    "keywords": ["keyword1", "keyword2", ...]
+}}
+
+Only return valid JSON, no additional text:"""
+
+        try:
+            # Add timeout protection - LLM calls can be slow
+            import asyncio
+            response = await asyncio.wait_for(
+                self.llm.simple_call(prompt, temperature=0.3),
+                timeout=30.0  # 30 second timeout per rule
+            )
+
+            # Clean up response - remove markdown code blocks if present
+            response = response.strip()
+            if response.startswith("```json"):
+                response = response[7:]
+            if response.startswith("```"):
+                response = response[3:]
+            if response.endswith("```"):
+                response = response[:-3]
+            response = response.strip()
+
+            import json
+            enhanced_data = json.loads(response)
+
+            # Ensure all required fields exist
+            result = {
+                "rule": enhanced_data.get("rule", rule_text),
+                "pattern": enhanced_data.get("pattern", rule_text),
+                "description": enhanced_data.get("description", rule_text),
+                "severity": enhanced_data.get("severity", "medium"),
+                "edge_cases": enhanced_data.get("edge_cases", []),
+                "improvements": enhanced_data.get("improvements", []),
+                "keywords": enhanced_data.get("keywords", [])
+            }
+
+            # Validate severity
+            if result["severity"] not in ["low", "medium", "high", "critical"]:
+                result["severity"] = "medium"
+
+            return result
+
+        except asyncio.TimeoutError:
+            # Timeout - return original rule
+            print(f"LLM enhancement timeout for rule: {rule_text[:50]}...")
+            return {
+                "rule": rule_text,
+                "pattern": rule_text,
+                "description": rule_text,
+                "severity": "medium",
+                "edge_cases": [],
+                "improvements": ["Enhancement timed out - using original rule"],
+                "keywords": []
+            }
+        except Exception as e:
+            # Fallback to original rule if LLM fails
+            print(f"LLM enhancement error: {e}")
+            return {
+                "rule": rule_text,
+                "pattern": rule_text,
+                "description": rule_text,
+                "severity": "medium",
+                "edge_cases": [],
+                "improvements": [f"Enhancement failed: {str(e)[:50]}"],
+                "keywords": []
+            }

+    async def enhance_rules_bulk(
+        self,
+        rules: List[str],
+        existing_rules: Optional[List[str]] = None
+    ) -> List[Dict[str, Any]]:
+        """
+        Enhance multiple rules at once.
+        Processes rules sequentially with error handling to avoid timeout.
+
+        Args:
+            rules: List of rule texts to enhance
+            existing_rules: List of existing rules for context
+
+        Returns:
+            List of enhanced rule dictionaries
+        """
+        enhanced_rules = []
+
+        for i, rule in enumerate(rules):
+            try:
+                # Enhance each rule individually with timeout protection
+                enhanced = await self.enhance_rule(rule, existing_rules)
+                enhanced_rules.append(enhanced)
+            except Exception as e:
+                # If enhancement fails for one rule, use the original rule
+                # so the remaining rules can still be processed
+                print(f"Warning: Rule {i+1}/{len(rules)} enhancement failed: {e}")
+                enhanced_rules.append({
+                    "rule": rule,
+                    "pattern": rule,
+                    "description": rule,
+                    "severity": "medium",
+                    "edge_cases": [],
+                    "improvements": ["Enhancement skipped due to error"],
+                    "keywords": []
+                })
+
+        return enhanced_rules
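`enhance_rule` strips markdown code fences from the LLM reply before handing it to `json.loads`, since models often wrap JSON in a fenced block despite being told not to. That cleanup step, extracted as a standalone sketch:

```python
def strip_code_fences(response: str) -> str:
    """Remove a surrounding ```json ... ``` (or bare ```) fence, as enhance_rule does."""
    response = response.strip()
    if response.startswith("```json"):
        response = response[7:]
    if response.startswith("```"):
        response = response[3:]
    if response.endswith("```"):
        response = response[:-3]
    return response.strip()

raw = '```json\n{"severity": "high"}\n```'
print(strip_code_fences(raw))  # {"severity": "high"}
```

Both prefix checks are needed: after slicing off `` ```json ``, a reply fenced with a plain `` ``` `` is still handled by the second branch.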
backend/api/storage/create_supabase_table.py
ADDED

@@ -0,0 +1,212 @@
+"""
+Script to create admin_rules table in Supabase programmatically.
+Run this script to set up the table in your Supabase project.
+"""
+
+import os
+import sys
+from pathlib import Path
+
+# Add backend to path
+backend_dir = Path(__file__).resolve().parents[2]
+sys.path.insert(0, str(backend_dir))
+
+try:
+    from supabase import create_client, Client
+except ImportError:
+    print("❌ Supabase client not installed. Run: pip install supabase")
+    sys.exit(1)
+
+
+def create_admin_rules_table():
+    """
+    Create the admin_rules table in Supabase with all necessary columns,
+    indexes, RLS policies, and triggers.
+    """
+    supabase_url = os.getenv("SUPABASE_URL")
+    supabase_key = os.getenv("SUPABASE_SERVICE_KEY")
+
+    if not supabase_url or not supabase_key:
+        print("❌ Supabase credentials missing!")
+        print("   Set SUPABASE_URL and SUPABASE_SERVICE_KEY in your .env file")
+        return False
+
+    try:
+        client = create_client(supabase_url, supabase_key)
+
+        print("Connecting to Supabase...")
+        print(f"   URL: {supabase_url}")
+
+        # SQL script to create the table
+        sql_script = """
+        -- Create admin_rules table
+        CREATE TABLE IF NOT EXISTS admin_rules (
+            id BIGSERIAL PRIMARY KEY,
+            tenant_id TEXT NOT NULL,
+            rule TEXT NOT NULL,
+            pattern TEXT,
+            severity TEXT DEFAULT 'medium' CHECK (severity IN ('low', 'medium', 'high', 'critical')),
+            description TEXT,
+            enabled BOOLEAN DEFAULT true,
+            created_at TIMESTAMPTZ DEFAULT NOW(),
+            updated_at TIMESTAMPTZ DEFAULT NOW(),
+            UNIQUE(tenant_id, rule)
+        );
+
+        -- Create indexes for faster queries
+        CREATE INDEX IF NOT EXISTS idx_admin_rules_tenant_id ON admin_rules(tenant_id);
+        CREATE INDEX IF NOT EXISTS idx_admin_rules_enabled ON admin_rules(enabled);
+        CREATE INDEX IF NOT EXISTS idx_admin_rules_tenant_enabled ON admin_rules(tenant_id, enabled);
+
+        -- Enable Row Level Security
+        ALTER TABLE admin_rules ENABLE ROW LEVEL SECURITY;
+
+        -- Drop existing policy if it exists (to avoid conflicts)
+        DROP POLICY IF EXISTS "Service role can manage all admin rules" ON admin_rules;
+
+        -- Create policy to allow service role to access all rows
+        CREATE POLICY "Service role can manage all admin rules"
+        ON admin_rules
+        FOR ALL
+        USING (true)
+        WITH CHECK (true);
+
+        -- Create function to automatically update updated_at timestamp
+        CREATE OR REPLACE FUNCTION update_updated_at_column()
+        RETURNS TRIGGER AS $$
+        BEGIN
+            NEW.updated_at = NOW();
+            RETURN NEW;
+        END;
+        $$ language 'plpgsql';
+
+        -- Drop existing trigger if it exists
+        DROP TRIGGER IF EXISTS update_admin_rules_updated_at ON admin_rules;
+
+        -- Create trigger to automatically update updated_at
+        CREATE TRIGGER update_admin_rules_updated_at
+            BEFORE UPDATE ON admin_rules
+            FOR EACH ROW
+            EXECUTE FUNCTION update_updated_at_column();
+        """
+
+        print("\nExecuting SQL to create admin_rules table...")
+
+        # Execute SQL using Supabase REST API
+        # Note: Supabase Python client doesn't have direct SQL execution
+        # We need to use the REST API or an rpc function
+
+        # Alternative: Use the REST API to execute SQL
+        import httpx
+        import json
+
+        response = httpx.post(
+            f"{supabase_url}/rest/v1/rpc/exec_sql",
+            headers={
+                "apikey": supabase_key,
+                "Authorization": f"Bearer {supabase_key}",
+                "Content-Type": "application/json"
+            },
+            json={"query": sql_script},
+            timeout=30
+        )
+
+        if response.status_code in [200, 201, 204]:
+            print("✅ Table created successfully!")
+            print("\nVerifying table exists...")
+
+            # Try to query the table to verify it exists
+            try:
+                result = client.table("admin_rules").select("id").limit(1).execute()
+                print("✅ Table verified - admin_rules table exists and is accessible")
+                return True
+            except Exception as e:
+                # Table might exist but be empty, which is fine
+                if "relation" in str(e).lower() or "does not exist" in str(e).lower():
+                    print("⚠️ Table might not have been created. Check Supabase dashboard.")
+                    return False
+                else:
+                    print("✅ Table exists (empty table)")
+                    return True
+        else:
+            # If rpc doesn't work, try direct SQL execution via PostgREST
+            # Some Supabase setups allow direct SQL execution
+            print("⚠️ RPC method not available, trying alternative method...")
+            print("   You may need to run the SQL manually in Supabase SQL Editor")
+            print(f"   See: supabase_admin_rules_table.sql")
+            return False
+
+    except Exception as e:
+        print(f"❌ Error creating table: {e}")
+        print("\nAlternative: Run the SQL manually in Supabase SQL Editor")
+        print("   1. Go to Supabase Dashboard → SQL Editor")
+        print("   2. Copy contents of supabase_admin_rules_table.sql")
+        print("   3. Paste and run in SQL Editor")
+        return False
+
+
+def verify_table_structure():
+    """Verify the table structure by checking columns."""
+    supabase_url = os.getenv("SUPABASE_URL")
+    supabase_key = os.getenv("SUPABASE_SERVICE_KEY")
+
+    if not supabase_url or not supabase_key:
+        return False
+
+    try:
+        client = create_client(supabase_url, supabase_key)
+
+        # Try to get table info by querying with limit 0
+        result = client.table("admin_rules").select("*").limit(0).execute()
+
+        print("\nTable Structure Verified:")
+        print("   ✅ admin_rules table exists")
+        print("   ✅ Table is accessible")
+        return True
+    except Exception as e:
+        if "relation" in str(e).lower() or "does not exist" in str(e).lower():
+            print("❌ Table does not exist yet")
+            return False
+        else:
+            print(f"⚠️ Could not verify: {e}")
+            return False
+
+
+if __name__ == "__main__":
+    print("=" * 60)
+    print("Supabase Admin Rules Table Creator")
+    print("=" * 60)
+    print()
+
+    # Load environment variables
+    from dotenv import load_dotenv
+    load_dotenv()
+
+    # Check if table already exists
+    print("Checking if table already exists...")
+    if verify_table_structure():
+        print("\n✅ Table already exists! No need to create it.")
+        response = input("\nDo you want to recreate it? (y/N): ")
+        if response.lower() != 'y':
+            print("Exiting...")
+            sys.exit(0)
+
+    # Create the table
+    success = create_admin_rules_table()
+
+    if success:
+        print("\n" + "=" * 60)
+        print("✅ Setup Complete!")
+        print("=" * 60)
+        print("\nYou can now use the RulesStore with Supabase.")
+        print("Rules will be automatically saved to Supabase instead of SQLite.")
+    else:
+        print("\n" + "=" * 60)
+        print("⚠️ Automatic setup failed")
+        print("=" * 60)
+        print("\nPlease run the SQL manually:")
+        print("1. Go to Supabase Dashboard → SQL Editor")
+        print("2. Open: supabase_admin_rules_table.sql")
+        print("3. Copy and paste the SQL into the editor")
+        print("4. Click 'Run' to execute")
backend/api/storage/rules_store.py
CHANGED
|
@@ -1,21 +1,210 @@
 import sqlite3
 import time
+import os
 from pathlib import Path
 from typing import List, Optional, Dict, Any
 
+# Try to import Supabase client
+try:
+    from supabase import create_client, Client
+    SUPABASE_AVAILABLE = True
+except ImportError:
+    SUPABASE_AVAILABLE = False
+    Client = None
+
 
 class RulesStore:
     """
-
-
+    Store for admin rules with support for both Supabase and SQLite.
+    Uses Supabase if configured, otherwise falls back to SQLite.
     """
 
-    def __init__(self):
-
-
-
-
-
+    def __init__(self, use_supabase: Optional[bool] = None, auto_create_table: bool = False):
+        """
+        Initialize RulesStore.
+
+        Args:
+            use_supabase: If True, use Supabase; if False, use SQLite.
+                If None, auto-detect based on environment variables.
+            auto_create_table: If True, attempt to create Supabase table if it doesn't exist.
+                Default False to avoid blocking startup. Use create_table() method separately.
+        """
+        self.use_supabase = use_supabase
+        self._table_verified = False
+
+        # Auto-detect if Supabase should be used
+        if self.use_supabase is None:
+            supabase_url = os.getenv("SUPABASE_URL")
+            supabase_key = os.getenv("SUPABASE_SERVICE_KEY")
+            self.use_supabase = bool(supabase_url and supabase_key and SUPABASE_AVAILABLE)
+
+        if self.use_supabase:
+            # Initialize Supabase client
+            supabase_url = os.getenv("SUPABASE_URL")
+            supabase_key = os.getenv("SUPABASE_SERVICE_KEY")
+            if not supabase_url or not supabase_key:
+                # Don't raise error, fall back to SQLite instead
+                print("⚠️ Supabase credentials missing. Falling back to SQLite.")
+                self.use_supabase = False
+            else:
+                try:
+                    self.supabase_client = create_client(supabase_url, supabase_key)
+                    self.table_name = "admin_rules"
+                    # Only verify table existence, don't create during init
+                    if auto_create_table:
+                        self._ensure_supabase_table()
+                    else:
+                        # Quick check without blocking
+                        self._quick_table_check()
+                except Exception as e:
+                    print(f"⚠️ Failed to initialize Supabase client: {e}. Falling back to SQLite.")
+                    self.use_supabase = False
+
+        if not self.use_supabase:
+            # Initialize SQLite
+            root_dir = Path(__file__).resolve().parents[3]  # points to project root
+            data_dir = root_dir / "data"
+            data_dir.mkdir(parents=True, exist_ok=True)
+            self.db_path = data_dir / "admin_rules.db"
+            self._init_db()
+
+    def _quick_table_check(self):
+        """Quick non-blocking check if table exists."""
+        try:
+            self.supabase_client.table(self.table_name).select("id").limit(1).execute()
+            self._table_verified = True
+        except Exception:
+            # Table might not exist, but don't block startup
+            self._table_verified = False
+
+    def _ensure_supabase_table(self):
+        """Ensure the Supabase table exists, create it if needed."""
+        try:
+            # Try to query the table to see if it exists
+            self.supabase_client.table(self.table_name).select("id").limit(1).execute()
+            # If we get here, table exists
+            self._table_verified = True
+            return True
+        except Exception as e:
+            error_str = str(e).lower()
+            if "relation" in error_str or "does not exist" in error_str or "not found" in error_str:
+                # Table doesn't exist, try to create it (non-blocking)
+                print(f"⚠️ Table '{self.table_name}' does not exist in Supabase.")
+                print("   Run 'python create_supabase_table.py' to create it, or create manually in Supabase SQL Editor.")
+                # Don't block startup - just log the issue
+                return False
+            else:
+                # Other error, assume table exists
+                self._table_verified = True
+                return True
+
+    def create_table_if_needed(self):
+        """Public method to create table if needed. Can be called after startup."""
+        if not self.use_supabase:
+            return False
+        if self._table_verified:
+            return True
+        return self._create_supabase_table()
+
+    def _create_supabase_table(self) -> bool:
+        """Create the admin_rules table in Supabase programmatically."""
+        try:
+            # Read SQL file
+            sql_file = Path(__file__).resolve().parents[3] / "supabase_admin_rules_table.sql"
+            if not sql_file.exists():
+                print(f"   ⚠️ SQL file not found: {sql_file}")
+                self._show_manual_instructions()
+                return False
+
+            with open(sql_file, "r", encoding="utf-8") as f:
+                sql_content = f.read()
+
+            # Method 1: Try using psql if POSTGRESQL_URL is available (non-blocking with timeout)
+            postgres_url = os.getenv("POSTGRESQL_URL")
+            if postgres_url:
+                try:
+                    import subprocess
+                    print("   🔧 Attempting to create table via psql...")
+
+                    # Execute SQL using psql with timeout
+                    result = subprocess.run(
+                        ["psql", postgres_url, "-c", sql_content],
+                        capture_output=True,
+                        text=True,
+                        timeout=10,  # Shorter timeout to avoid blocking
+                    )
+
+                    if result.returncode == 0:
+                        print("   ✅ Table created successfully via psql!")
+                        self._table_verified = True
+                        return True
+                    else:
+                        print(f"   ⚠️ psql returned error: {result.stderr[:200]}")
+                except FileNotFoundError:
+                    print("   ⚠️ psql not found in PATH")
+                except subprocess.TimeoutExpired:
+                    print("   ⚠️ psql timed out (table creation may still be in progress)")
+                except Exception as e:
+                    print(f"   ⚠️ psql method failed: {e}")
+
+            # Method 2: Try using Supabase REST API (if custom function exists)
+            try:
+                import httpx
+
+                # Try different possible RPC endpoints
+                endpoints = [
+                    "/rest/v1/rpc/exec_sql",
+                    "/rest/v1/rpc/execute_sql",
+                    "/rest/v1/rpc/run_sql",
+                ]
+
+                for endpoint in endpoints:
+                    try:
+                        response = httpx.post(
+                            f"{os.getenv('SUPABASE_URL')}{endpoint}",
+                            headers={
+                                "apikey": os.getenv("SUPABASE_SERVICE_KEY"),
+                                "Authorization": f"Bearer {os.getenv('SUPABASE_SERVICE_KEY')}",
+                                "Content-Type": "application/json",
+                                "Prefer": "return=representation",
+                            },
+                            json={"query": sql_content, "sql": sql_content},
+                            timeout=10,  # Shorter timeout
+                        )
+
+                        if response.status_code in [200, 201, 204]:
+                            print("   ✅ Table created successfully via API!")
+                            self._table_verified = True
+                            return True
+                    except Exception:
+                        continue
+            except ImportError:
+                pass
+            except Exception as e:
+                print(f"   ⚠️ API method failed: {e}")
+
+            # Method 3: Show manual instructions (don't block)
+            self._show_manual_instructions()
+            return False
+
+        except Exception as e:
+            print(f"   ❌ Error: {e}")
+            self._show_manual_instructions()
+            return False
+
+    def _show_manual_instructions(self):
+        """Show instructions for manual table creation."""
+        sql_file = Path(__file__).resolve().parents[3] / "supabase_admin_rules_table.sql"
+        print("\n   📋 Manual Setup Required:")
+        print("   1. Go to: https://app.supabase.com → Your Project → SQL Editor")
+        print("   2. Click 'New query'")
+        if sql_file.exists():
+            print(f"   3. Open file: {sql_file.name}")
+            print("   4. Copy all SQL and paste into SQL Editor")
+        else:
+            print("   3. Copy the SQL from supabase_admin_rules_table.sql")
+        print("   5. Click 'Run' to execute")
+        print("\n   💡 After creating the table, restart your application.")
 
     def _init_db(self):
         with sqlite3.connect(self.db_path) as conn:
@@ -60,24 +249,63 @@ class RulesStore:
 
     def get_rules(self, tenant_id: str) -> List[str]:
         """Get all rules as a list of rule text strings (backward compatibility)."""
-
-
-
-
-
-
+        if self.use_supabase:
+            try:
+                response = self.supabase_client.table(self.table_name)\
+                    .select("rule")\
+                    .eq("tenant_id", tenant_id)\
+                    .eq("enabled", True)\
+                    .order("id")\
+                    .execute()
+                return [row["rule"] for row in response.data]
+            except Exception as e:
+                print(f"Error fetching rules from Supabase: {e}")
+                return []
+        else:
+            with sqlite3.connect(self.db_path) as conn:
+                cursor = conn.execute(
+                    "SELECT rule FROM admin_rules WHERE tenant_id = ? AND enabled = 1 ORDER BY id ASC",
+                    (tenant_id,),
+                )
+                return [row[0] for row in cursor.fetchall()]
 
     def get_rules_detailed(self, tenant_id: str) -> List[Dict[str, Any]]:
         """Get all rules with full metadata including pattern, severity, etc."""
-
-
-
-
-
-
-
-
-
+        if self.use_supabase:
+            try:
+                response = self.supabase_client.table(self.table_name)\
+                    .select("*")\
+                    .eq("tenant_id", tenant_id)\
+                    .eq("enabled", True)\
+                    .order("id")\
+                    .execute()
+                # Convert to list of dicts and ensure created_at is a timestamp
+                rules = []
+                for row in response.data:
+                    rule_dict = dict(row)
+                    # Convert created_at to Unix timestamp if it's a string
+                    if "created_at" in rule_dict and isinstance(rule_dict["created_at"], str):
+                        try:
+                            from datetime import datetime
+                            dt = datetime.fromisoformat(rule_dict["created_at"].replace("Z", "+00:00"))
+                            rule_dict["created_at"] = int(dt.timestamp())
+                        except Exception:
+                            rule_dict["created_at"] = int(time.time())
+                    rules.append(rule_dict)
+                return rules
+            except Exception as e:
+                print(f"Error fetching detailed rules from Supabase: {e}")
+                return []
+        else:
+            with sqlite3.connect(self.db_path) as conn:
+                conn.row_factory = sqlite3.Row
+                cursor = conn.execute(
+                    """SELECT id, tenant_id, rule, pattern, severity, description, enabled, created_at
+                       FROM admin_rules WHERE tenant_id = ? AND enabled = 1 ORDER BY id ASC""",
+                    (tenant_id,),
+                )
+                rows = cursor.fetchall()
+                return [dict(row) for row in rows]
 
     def add_rule(
         self,
@@ -92,44 +320,103 @@ class RulesStore:
         Add a rule with optional regex pattern and severity.
         If pattern is None, the rule text itself is used as the pattern.
         """
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
+        # If pattern not provided, use rule text as pattern
+        pattern_value = pattern or rule
+        description_value = description or rule
+
+        if self.use_supabase:
+            try:
+                # Use upsert to handle unique constraint
+                data = {
+                    "tenant_id": tenant_id,
+                    "rule": rule,
+                    "pattern": pattern_value,
+                    "severity": severity,
+                    "description": description_value,
+                    "enabled": enabled,
+                }
+                # Supabase upsert - will insert or update based on unique constraint
+                response = self.supabase_client.table(self.table_name)\
+                    .upsert(data)\
+                    .execute()
+                return True
+            except Exception as e:
+                print(f"Error adding rule to Supabase: {e}")
+                return False
+        else:
+            try:
+                with sqlite3.connect(self.db_path) as conn:
+                    conn.execute(
+                        """INSERT OR IGNORE INTO admin_rules
+                           (tenant_id, rule, pattern, severity, description, enabled, created_at)
+                           VALUES (?, ?, ?, ?, ?, ?, ?)""",
+                        (tenant_id, rule, pattern_value, severity, description_value, 1 if enabled else 0, int(time.time())),
+                    )
+                    conn.commit()
+                    return True
+            except sqlite3.Error:
+                return False
 
     def add_rules_bulk(self, tenant_id: str, rules: List[str]) -> List[str]:
         added = []
-
-
-
-
-
-
-
-
-
-
-
+        if self.use_supabase:
+            try:
+                # Prepare bulk data
+                bulk_data = [
+                    {
+                        "tenant_id": tenant_id,
+                        "rule": rule,
+                        "pattern": rule,  # Use rule text as pattern
+                        "severity": "medium",
+                        "description": rule,
+                        "enabled": True,
+                    }
+                    for rule in rules
+                ]
+                # Upsert all rules at once
+                response = self.supabase_client.table(self.table_name)\
+                    .upsert(bulk_data)\
+                    .execute()
+                added = rules  # All rules were attempted
+            except Exception as e:
+                print(f"Error adding bulk rules to Supabase: {e}")
+                # Fallback: try one by one
+                for rule in rules:
+                    if self.add_rule(tenant_id, rule):
+                        added.append(rule)
+        else:
+            with sqlite3.connect(self.db_path) as conn:
+                for rule in rules:
+                    try:
+                        conn.execute(
+                            "INSERT OR IGNORE INTO admin_rules (tenant_id, rule) VALUES (?, ?)",
+                            (tenant_id, rule),
+                        )
+                        added.append(rule)
+                    except sqlite3.Error:
+                        continue
+                conn.commit()
         return added
 
     def delete_rule(self, tenant_id: str, rule: str) -> bool:
-
-
-
-
-
-
-
+        if self.use_supabase:
+            try:
+                response = self.supabase_client.table(self.table_name)\
+                    .delete()\
+                    .eq("tenant_id", tenant_id)\
+                    .eq("rule", rule)\
+                    .execute()
+                # Check if any rows were deleted
+                return len(response.data) > 0 if response.data else False
+            except Exception as e:
+                print(f"Error deleting rule from Supabase: {e}")
+                return False
+        else:
+            with sqlite3.connect(self.db_path) as conn:
+                cursor = conn.execute(
+                    "DELETE FROM admin_rules WHERE tenant_id = ? AND rule = ?",
+                    (tenant_id, rule),
+                )
                 conn.commit()
                 return cursor.rowcount > 0
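The backend selection in `RulesStore.__init__` above reduces to one predicate: Supabase is used only when both credentials are set and the `supabase` package imported. A minimal sketch of that decision logic (the `detect_backend` helper is illustrative, not part of the codebase):

```python
def detect_backend(env, supabase_available=True):
    # Mirrors RulesStore auto-detection: prefer Supabase only when both
    # SUPABASE_URL and SUPABASE_SERVICE_KEY are present AND the supabase
    # package imported successfully; otherwise fall back to SQLite.
    if env.get("SUPABASE_URL") and env.get("SUPABASE_SERVICE_KEY") and supabase_available:
        return "supabase"
    return "sqlite"

print(detect_backend({}))  # sqlite
print(detect_backend({"SUPABASE_URL": "https://x.supabase.co",
                      "SUPABASE_SERVICE_KEY": "service-key"}))  # supabase
print(detect_backend({"SUPABASE_URL": "https://x.supabase.co",
                      "SUPABASE_SERVICE_KEY": "service-key"},
                     supabase_available=False))  # sqlite
```

Because the later credential and client-creation failures also flip `use_supabase` back to `False`, a misconfigured environment degrades to the local SQLite file rather than failing startup.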
check_rules_db.py
ADDED
|
@@ -0,0 +1,43 @@
+"""
+Quick script to check if admin rules are saved in the database
+"""
+import sqlite3
+from pathlib import Path
+
+db_path = Path("data/admin_rules.db")
+
+if db_path.exists():
+    print(f"✅ Database found at: {db_path}")
+    print("\n" + "="*60)
+
+    conn = sqlite3.connect(db_path)
+    conn.row_factory = sqlite3.Row
+    cursor = conn.cursor()
+
+    # Get all rules
+    cursor.execute("SELECT * FROM admin_rules ORDER BY created_at DESC")
+    rules = cursor.fetchall()
+
+    if rules:
+        print(f"📋 Found {len(rules)} rule(s) in database:\n")
+        for rule in rules:
+            print(f"Tenant: {rule['tenant_id']}")
+            print(f"Rule: {rule['rule']}")
+            print(f"Pattern: {rule['pattern'] or 'N/A'}")
+            print(f"Severity: {rule['severity']}")
+            print(f"Enabled: {rule['enabled']}")
+            print(f"Created: {rule['created_at']}")
+            print("-" * 60)
+    else:
+        print("⚠️ No rules found in database.")
+        print("   Add rules via the Gradio UI or API to populate the database.")
+
+    conn.close()
+else:
+    print(f"❌ Database not found at: {db_path}")
+    print("   The database will be created automatically when you add your first rule.")
+    print("\n💡 To add rules:")
+    print("   1. Open Gradio UI (python app.py)")
+    print("   2. Go to 'Admin Rules & Compliance' tab")
+    print("   3. Add rules in the text box and click 'Upload / Append Rules'")
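check_rules_db.py only finds rows once `RulesStore._init_db` has created the SQLite schema. A self-contained round trip against an in-memory database that mirrors that read path — the column list is inferred from the queries in this commit, so treat the exact DDL as an approximation:

```python
import sqlite3
import time

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
# Schema approximated from the columns the store reads and writes;
# the real DDL lives in RulesStore._init_db (not shown in this commit view).
conn.execute("""CREATE TABLE admin_rules (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    tenant_id TEXT NOT NULL,
    rule TEXT NOT NULL,
    pattern TEXT,
    severity TEXT DEFAULT 'medium',
    description TEXT,
    enabled INTEGER DEFAULT 1,
    created_at INTEGER,
    UNIQUE (tenant_id, rule)
)""")
# Same INSERT shape as RulesStore.add_rule's SQLite branch.
conn.execute(
    "INSERT OR IGNORE INTO admin_rules "
    "(tenant_id, rule, pattern, severity, description, enabled, created_at) "
    "VALUES (?, ?, ?, ?, ?, ?, ?)",
    ("default", "no PII in logs", "no PII in logs", "high", "no PII in logs", 1, int(time.time())),
)
# Same SELECT the checker script runs.
rows = conn.execute("SELECT * FROM admin_rules ORDER BY created_at DESC").fetchall()
print(len(rows))            # 1
print(rows[0]["severity"])  # high
```

The `UNIQUE (tenant_id, rule)` constraint is what makes `INSERT OR IGNORE` safe to re-run without duplicating rules.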
create_supabase_table.py
ADDED
|
@@ -0,0 +1,185 @@
+"""
+Create admin_rules table in Supabase programmatically.
+This script uses the Supabase Python client to set up the table.
+"""
+
+import os
+import sys
+from pathlib import Path
+from dotenv import load_dotenv
+
+load_dotenv()
+
+
+def create_table_using_supabase_client():
+    """
+    Create the admin_rules table using Supabase client.
+    Since Supabase doesn't allow direct SQL execution via REST API,
+    we'll use a workaround or provide clear instructions.
+    """
+    supabase_url = os.getenv("SUPABASE_URL")
+    supabase_key = os.getenv("SUPABASE_SERVICE_KEY")
+
+    if not supabase_url or not supabase_key:
+        print("❌ Missing Supabase credentials!")
+        print("   Set SUPABASE_URL and SUPABASE_SERVICE_KEY in .env file")
+        return False
+
+    try:
+        from supabase import create_client
+        import httpx
+
+        print("🔌 Connecting to Supabase...")
+        client = create_client(supabase_url, supabase_key)
+
+        # Read SQL from file
+        sql_file = Path(__file__).parent / "supabase_admin_rules_table.sql"
+        if not sql_file.exists():
+            print(f"❌ SQL file not found: {sql_file}")
+            return False
+
+        with open(sql_file, "r", encoding="utf-8") as f:
+            sql_content = f.read()
+
+        print("📝 Attempting to create table via Supabase API...")
+
+        # Method 1: Try using Supabase Management API (if available)
+        # This requires the project to have pg_net extension enabled
+        try:
+            # Use the REST API to execute SQL via a custom function
+            # First, check if we can use the SQL execution endpoint
+            response = httpx.post(
+                f"{supabase_url}/rest/v1/rpc/exec_sql",
+                headers={
+                    "apikey": supabase_key,
+                    "Authorization": f"Bearer {supabase_key}",
+                    "Content-Type": "application/json",
+                    "Prefer": "return=representation",
+                },
+                json={"query": sql_content},
+                timeout=30,
+            )
+
+            if response.status_code in [200, 201, 204]:
+                print("✅ Table created successfully via API!")
+                return True
+            else:
+                print(f"⚠️ API method returned: {response.status_code}")
+                print(f"   Response: {response.text[:200]}")
+        except Exception as e:
+            print(f"⚠️ API method failed: {e}")
+
+        # Method 2: Try using Supabase Python client's table operations
+        # This won't work for DDL, but we can verify if table exists
+        print("\n🔍 Checking if table already exists...")
+        try:
+            result = client.table("admin_rules").select("id").limit(1).execute()
+            print("✅ Table 'admin_rules' already exists!")
+            return True
+        except Exception as e:
+            error_str = str(e).lower()
+            if "relation" in error_str or "does not exist" in error_str:
+                print("⚠️ Table does not exist yet.")
+            else:
+                print(f"⚠️ Error checking table: {e}")
+
+        # Method 3: Since direct SQL execution isn't supported, show instructions
+        print("\n" + "=" * 70)
+        print("📋 MANUAL SETUP REQUIRED")
+        print("=" * 70)
+        print("\nSupabase doesn't allow programmatic SQL execution for security.")
+        print("Please run the SQL manually in Supabase Dashboard:\n")
+        print("1. Go to: https://app.supabase.com")
+        print("2. Select your project")
+        print("3. Click 'SQL Editor' (left sidebar)")
+        print("4. Click 'New query'")
+        print("5. Copy the SQL below and paste it:")
+        print("\n" + "-" * 70)
+        print(sql_content)
+        print("-" * 70)
+        print("\n6. Click 'Run' button (or press Ctrl+Enter)")
+        print("7. Wait for success confirmation")
+        print("\n✅ After running, the table will be created automatically!")
+
+        return False
+
+    except ImportError:
+        print("❌ Supabase client not installed")
+        print("   Run: pip install supabase")
+        return False
+    except Exception as e:
+        print(f"❌ Error: {e}")
+        import traceback
+        traceback.print_exc()
+        return False
+
+
+def create_table_via_psql():
+    """
+    Alternative: Use psql (PostgreSQL client) to execute SQL directly.
+    This requires POSTGRESQL_URL to be set.
+    """
+    postgres_url = os.getenv("POSTGRESQL_URL")
+    if not postgres_url:
+        print("⚠️ POSTGRESQL_URL not set, skipping psql method")
+        return False
+
+    sql_file = Path(__file__).parent / "supabase_admin_rules_table.sql"
+    if not sql_file.exists():
+        return False
+
+    try:
+        import subprocess
+        print("🔧 Attempting to create table via psql...")
+
+        # Execute SQL using psql
+        result = subprocess.run(
+            ["psql", postgres_url, "-f", str(sql_file)],
+            capture_output=True,
+            text=True,
+            timeout=30,
+        )
+
+        if result.returncode == 0:
+            print("✅ Table created successfully via psql!")
+            return True
+        else:
+            print(f"⚠️ psql failed: {result.stderr}")
+            return False
+    except FileNotFoundError:
+        print("⚠️ psql not found in PATH")
+        return False
+    except Exception as e:
+        print(f"⚠️ psql method failed: {e}")
+        return False
+
+
+if __name__ == "__main__":
+    print("=" * 70)
+    print("Supabase Admin Rules Table Creator")
+    print("=" * 70)
+    print()
+
+    # Try Method 1: Supabase client
+    success = create_table_using_supabase_client()
+
+    if not success:
+        # Try Method 2: psql (if available)
+        print("\n" + "=" * 70)
+        print("Trying alternative method: psql")
+        print("=" * 70)
+        success = create_table_via_psql()
+
+    if success:
+        print("\n" + "=" * 70)
+        print("✅ SUCCESS!")
+        print("=" * 70)
+        print("\nThe admin_rules table has been created in Supabase.")
+        print("RulesStore will now use Supabase instead of SQLite.")
+    else:
+        print("\n" + "=" * 70)
+        print("📋 Manual Setup Required")
+        print("=" * 70)
+        print("\nPlease run the SQL manually in Supabase SQL Editor.")
+        print("The SQL script is ready in: supabase_admin_rules_table.sql")
+        print("\nAfter creating the table, RulesStore will automatically use Supabase.")
create_supabase_table_simple.py
ADDED
|
@@ -0,0 +1,70 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
"""
Simple script to create admin_rules table in Supabase.
This uses the Supabase Management API or direct SQL execution.
"""

import os
from pathlib import Path  # required: Path is used below to locate the SQL file

from dotenv import load_dotenv

load_dotenv()

SUPABASE_URL = os.getenv("SUPABASE_URL")
SUPABASE_SERVICE_KEY = os.getenv("SUPABASE_SERVICE_KEY")

if not SUPABASE_URL or not SUPABASE_SERVICE_KEY:
    print("❌ Missing Supabase credentials!")
    print("   Set SUPABASE_URL and SUPABASE_SERVICE_KEY in .env file")
    exit(1)

# Read the SQL file
sql_file = Path("supabase_admin_rules_table.sql")
if not sql_file.exists():
    print(f"❌ SQL file not found: {sql_file}")
    exit(1)

with open(sql_file, "r") as f:
    sql_content = f.read()

print("🔌 Connecting to Supabase...")
print(f"   URL: {SUPABASE_URL[:50]}...")

# Method 1: Try using Supabase REST API with SQL execution
# Note: This requires the pg_net extension or a custom function
# Most Supabase projects don't allow direct SQL execution via REST API

# Method 2: Use Supabase Python client to execute via RPC
try:
    from supabase import create_client

    client = create_client(SUPABASE_URL, SUPABASE_SERVICE_KEY)

    # Split SQL into individual statements
    statements = [s.strip() for s in sql_content.split(";") if s.strip() and not s.strip().startswith("--")]

    print(f"\n📝 Executing {len(statements)} SQL statements...")

    # Execute each statement
    # Note: Supabase Python client doesn't support direct SQL execution
    # We'll need to use a workaround or manual execution

    print("\n⚠️  Direct SQL execution via Python client is not supported.")
    print("   Supabase requires SQL to be executed via the SQL Editor.")
    print("\n📋 Please follow these steps:")
    print("   1. Go to: https://app.supabase.com")
    print("   2. Select your project")
    print("   3. Click 'SQL Editor' in the left sidebar")
    print("   4. Click 'New query'")
    print("   5. Copy the contents of: supabase_admin_rules_table.sql")
    print("   6. Paste into the SQL Editor")
    print("   7. Click 'Run' (or press Ctrl+Enter)")
    print("\n✅ After running the SQL, the table will be created!")

except ImportError:
    print("❌ Supabase client not installed")
    print("   Run: pip install supabase")
except Exception as e:
    print(f"❌ Error: {e}")
    print("\n💡 Manual setup required - see instructions above")
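Note that the one-line splitter above filters comments only after splitting on `;`, so it drops any statement that begins with a `--` comment, including the DDL that follows the comment on subsequent lines. A minimal sketch of a splitter that strips full-line comments first (an assumption-laden simplification: semicolons inside string literals or `$$`-quoted function bodies are not handled):

```python
def split_sql_statements(sql: str) -> list[str]:
    """Split a SQL script into statements, dropping full-line '--' comments.

    Minimal sketch: semicolons inside string literals or $$-quoted
    function bodies are not handled, so complex DDL needs a real parser.
    """
    # Strip comment lines first, so a statement preceded by a comment
    # (e.g. "-- note\nCREATE TABLE ...") is not discarded wholesale.
    lines = [ln for ln in sql.splitlines() if not ln.lstrip().startswith("--")]
    cleaned = "\n".join(lines)
    return [s.strip() for s in cleaned.split(";") if s.strip()]


script = """
-- create the table
CREATE TABLE admin_rules (id serial PRIMARY KEY, rule text);
-- index it
CREATE INDEX idx_rule ON admin_rules (rule);
"""
print(split_sql_statements(script))
# → ['CREATE TABLE admin_rules (id serial PRIMARY KEY, rule text)',
#    'CREATE INDEX idx_rule ON admin_rules (rule)']
```

With the original post-split filter, the `CREATE TABLE` above would be lost because its statement text starts with the comment line.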
data/admin_rules.db
CHANGED
Binary files a/data/admin_rules.db and b/data/admin_rules.db differ

data/analytics.db
CHANGED
Binary files a/data/analytics.db and b/data/analytics.db differ
example_rules.txt
ADDED
@@ -0,0 +1,133 @@
# Admin Rules Examples for IntegraChat
# Copy and paste these rules into the Admin Rules & Compliance tab in Gradio UI

# ============================================================
# HIGH PRIORITY SECURITY RULES
# ============================================================

Block password disclosure requests
Prevent sharing of authentication credentials
No sharing of API keys or tokens
Block requests for user account passwords
Prevent disclosure of security credentials
Block social security number requests
No sharing of credit card information
Prevent disclosure of personal identification numbers
Block requests for bank account details
No sharing of confidential access codes

# ============================================================
# MEDIUM PRIORITY COMPLIANCE RULES
# ============================================================

Block requests for employee personal information
Prevent sharing of customer data without authorization
No unauthorized access to financial records
Block requests for confidential business strategies
Prevent disclosure of proprietary information
No sharing of trade secrets
Block requests for competitor analysis data
Prevent unauthorized data export
No sharing of internal process documentation
Block requests for customer contact lists

# ============================================================
# DATA PRIVACY RULES
# ============================================================

Block requests for personal data of EU citizens
Prevent sharing of health information
No disclosure of medical records
Block requests for biometric data
Prevent sharing of location tracking information
No disclosure of children's personal information
Block requests for genetic information
Prevent sharing of religious or political affiliations
No disclosure of sexual orientation data
Block requests for financial transaction history

# ============================================================
# OPERATIONAL RULES
# ============================================================

Block requests to delete system logs
Prevent unauthorized system configuration changes
No sharing of infrastructure credentials
Block requests for production database access
Prevent disclosure of deployment procedures
No sharing of monitoring tool credentials
Block requests for backup restoration procedures
Prevent unauthorized access to cloud resources
No sharing of encryption keys
Block requests for system administrator privileges

# ============================================================
# CONTENT MODERATION RULES
# ============================================================

Block requests for generating harmful content
Prevent creation of offensive material
No sharing of inappropriate content
Block requests for generating misleading information
Prevent creation of fake news content
No sharing of defamatory statements
Block requests for generating hate speech
Prevent creation of discriminatory content
No sharing of violent content
Block requests for generating illegal content

# ============================================================
# SPECIFIC KEYWORD-BASED RULES
# ============================================================

Block queries containing "password" and "reset"
Prevent requests with "API key" and "generate"
No queries containing "SSN" or "social security"
Block requests with "credit card" and "number"
Prevent queries containing "bank account" and "details"
No requests with "admin" and "access"
Block queries containing "delete" and "all data"
Prevent requests with "export" and "customer list"
No queries containing "encryption key" and "show"
Block requests with "root password" and "share"

# ============================================================
# REGULATORY COMPLIANCE RULES
# ============================================================

Block requests violating GDPR regulations
Prevent sharing of data without consent
No disclosure of information to unauthorized parties
Block requests for data subject to HIPAA
Prevent sharing of protected health information
No disclosure of financial data subject to PCI-DSS
Block requests violating SOX compliance
Prevent sharing of audit trail information
No disclosure of information subject to FERPA
Block requests violating industry-specific regulations

# ============================================================
# RESPONSE BEHAVIOR RULES
# ============================================================

Keep greeting responses brief and simple
Do not provide verbose responses to simple greetings
Respond to hello and hi with short friendly greetings only
Avoid mentioning RAG or documentation sources in greeting responses
Keep casual conversation responses concise

# ============================================================
# CUSTOM BUSINESS RULES (Examples)
# ============================================================

Block requests for competitor pricing information
Prevent sharing of upcoming product launch details
No disclosure of merger and acquisition information
Block requests for employee salary information
Prevent sharing of vendor contract terms
No disclosure of strategic partnership details
Block requests for customer churn analysis data
Prevent sharing of marketing campaign strategies
No disclosure of research and development projects
Block requests for intellectual property information
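Both the frontend and the file-upload path skip blank lines and lines starting with `#` before bulk-uploading. A minimal sketch of the same filtering in Python, useful when scripting uploads of `example_rules.txt`:

```python
def parse_rules(text: str) -> list[str]:
    """Return rule lines, skipping blank lines and '#' comment lines,
    mirroring the filtering done by the admin-rules frontend."""
    return [
        line.strip()
        for line in text.splitlines()
        if line.strip() and not line.strip().startswith("#")
    ]


sample = """# HIGH PRIORITY SECURITY RULES

Block password disclosure requests
No sharing of API keys or tokens
"""
print(parse_rules(sample))
# → ['Block password disclosure requests', 'No sharing of API keys or tokens']
```

A script could then POST the parsed list as `{"rules": [...]}` to the `/admin/rules/bulk` endpoint.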
example_rules_detailed.json
ADDED
@@ -0,0 +1,131 @@
{
  "rules": [
    {
      "rule": "Block password disclosure requests",
      "pattern": ".*(password|pwd|passcode|credential|login).*",
      "severity": "high",
      "description": "Prevents users from requesting or sharing passwords, credentials, or authentication information"
    },
    {
      "rule": "Prevent sharing of API keys or tokens",
      "pattern": ".*(api.?key|token|secret|access.?key|auth.?token).*",
      "severity": "critical",
      "description": "Blocks requests to share, generate, or disclose API keys, tokens, or authentication secrets"
    },
    {
      "rule": "Block social security number requests",
      "pattern": ".*(ssn|social.?security|tax.?id|ein).*",
      "severity": "high",
      "description": "Prevents disclosure of social security numbers or tax identification numbers"
    },
    {
      "rule": "No sharing of credit card information",
      "pattern": ".*(credit.?card|card.?number|cvv|cvc|expiration).*",
      "severity": "critical",
      "description": "Blocks requests to share or store credit card numbers, CVV codes, or payment card information"
    },
    {
      "rule": "Block requests for bank account details",
      "pattern": ".*(bank.?account|routing.?number|account.?number|swift|iban).*",
      "severity": "high",
      "description": "Prevents disclosure of bank account numbers, routing numbers, or financial account information"
    },
    {
      "rule": "Prevent sharing of employee personal information",
      "pattern": ".*(employee.?data|staff.?info|personnel.?record|hr.?data).*",
      "severity": "medium",
      "description": "Blocks requests to access or share employee personal information without authorization"
    },
    {
      "rule": "No unauthorized access to financial records",
      "pattern": ".*(financial.?record|accounting|bookkeeping|financial.?data).*",
      "severity": "high",
      "description": "Prevents unauthorized access to financial records, accounting data, or bookkeeping information"
    },
    {
      "rule": "Block requests for confidential business strategies",
      "pattern": ".*(business.?strategy|strategic.?plan|confidential.?plan|roadmap).*",
      "severity": "medium",
      "description": "Prevents disclosure of confidential business strategies, plans, or roadmaps"
    },
    {
      "rule": "Prevent disclosure of proprietary information",
      "pattern": ".*(proprietary|trade.?secret|intellectual.?property|ip).*",
      "severity": "high",
      "description": "Blocks requests to share proprietary information, trade secrets, or intellectual property"
    },
    {
      "rule": "Block requests for personal data of EU citizens",
      "pattern": ".*(gdpr|eu.?citizen|personal.?data|data.?subject).*",
      "severity": "critical",
      "description": "Prevents unauthorized access to personal data of EU citizens, violating GDPR regulations"
    },
    {
      "rule": "Prevent sharing of health information",
      "pattern": ".*(health.?info|medical.?record|patient.?data|hipaa).*",
      "severity": "critical",
      "description": "Blocks requests to share health information or medical records, protecting HIPAA compliance"
    },
    {
      "rule": "No disclosure of children's personal information",
      "pattern": ".*(child|minor|under.?18|coppa).*",
      "severity": "critical",
      "description": "Prevents disclosure of personal information of children under 18, ensuring COPPA compliance"
    },
    {
      "rule": "Block requests to delete system logs",
      "pattern": ".*(delete.?log|remove.?log|clear.?log|purge.?log).*",
      "severity": "high",
      "description": "Prevents deletion or modification of system logs, which are critical for security and compliance"
    },
    {
      "rule": "Prevent unauthorized system configuration changes",
      "pattern": ".*(system.?config|change.?setting|modify.?config|update.?config).*",
      "severity": "high",
      "description": "Blocks unauthorized changes to system configuration that could compromise security"
    },
    {
      "rule": "No sharing of infrastructure credentials",
      "pattern": ".*(infrastructure|server.?credential|deployment.?key|cloud.?access).*",
      "severity": "critical",
      "description": "Prevents sharing of infrastructure credentials, server access, or cloud deployment keys"
    },
    {
      "rule": "Block requests for generating harmful content",
      "pattern": ".*(harmful|violent|hate.?speech|offensive|illegal).*",
      "severity": "medium",
      "description": "Prevents generation of harmful, violent, hateful, or illegal content"
    },
    {
      "rule": "Prevent creation of misleading information",
      "pattern": ".*(misleading|fake.?news|false.?info|disinformation).*",
      "severity": "medium",
      "description": "Blocks creation of misleading information, fake news, or disinformation"
    },
    {
      "rule": "No sharing of defamatory statements",
      "pattern": ".*(defamatory|libel|slander|defame).*",
      "severity": "medium",
      "description": "Prevents creation or sharing of defamatory statements that could cause legal issues"
    },
    {
      "rule": "Block requests for competitor pricing information",
      "pattern": ".*(competitor|pricing|competitive.?intelligence).*",
      "severity": "low",
      "description": "Prevents sharing of competitor pricing information or competitive intelligence"
    },
    {
      "rule": "Prevent sharing of upcoming product launch details",
      "pattern": ".*(product.?launch|upcoming.?release|new.?product).*",
      "severity": "medium",
      "description": "Blocks disclosure of upcoming product launches or new product information"
    }
  ],
  "usage_instructions": {
    "simple": "Copy rules from example_rules.txt and paste into Gradio UI",
    "detailed": "Use the JSON format with patterns and severity levels for more control",
    "bulk_upload": "Use the /admin/rules/bulk endpoint with the rules array",
    "individual": "Add rules one by one using the /admin/rules endpoint with JSON payload"
  }
}
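The detailed format pairs each rule with a regex `pattern` and a `severity`. The actual matching semantics live in the backend; purely as an illustration (case-insensitive `re.search` and this severity ordering are assumptions, not taken from the repo), a query can be checked against the patterns like this:

```python
import re

# Lower value sorts first, so the most severe match leads the result.
SEVERITY_ORDER = {"critical": 0, "high": 1, "medium": 2, "low": 3}


def match_rules(query: str, rules: list[dict]) -> list[dict]:
    """Return the rules whose pattern matches the query, most severe first."""
    hits = [r for r in rules if re.search(r["pattern"], query, re.IGNORECASE)]
    return sorted(hits, key=lambda r: SEVERITY_ORDER.get(r["severity"], 99))


rules = [
    {"rule": "Block password disclosure requests",
     "pattern": ".*(password|pwd|passcode|credential|login).*",
     "severity": "high"},
    {"rule": "Prevent sharing of API keys or tokens",
     "pattern": ".*(api.?key|token|secret|access.?key|auth.?token).*",
     "severity": "critical"},
]
hits = match_rules("Please share the admin password and API key", rules)
print([r["rule"] for r in hits])
# → ['Prevent sharing of API keys or tokens', 'Block password disclosure requests']
```

Note that broad alternatives such as the bare `ip` in the proprietary-information pattern will match any query containing those two letters, so patterns of this style are best treated as starting points to tighten.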
frontend/app/admin-rules/page.tsx
CHANGED

@@ -1,6 +1,6 @@
  "use client";

- import { useCallback, useMemo, useState } from "react";
  import Link from "next/link";

  import { AdminRulesPanel } from "@/components/admin-rules-panel";

@@ -19,6 +19,14 @@ export default function AdminRulesPage() {
  const [rules, setRules] = useState<string[]>([]);
  const [loading, setLoading] = useState(false);
  const [status, setStatus] = useState<StatusState>(null);

  const headers = useMemo(() => {
    if (!tenantId.trim()) return undefined;
@@ -50,6 +58,7 @@
    }
    const data = await response.json();
    setRules(data.rules ?? []);
    setStatus({ tone: "success", message: "Rules synced." });
  } catch (error: any) {
    setStatus({ tone: "error", message: error.message || "Failed to fetch rules" });
@@ -63,17 +72,17 @@
    const lines = rulesInput
      .split("\n")
      .map((line) => line.trim())
-     .filter(

    if (!lines.length) {
-     setStatus({ tone: "error", message: "Add at least one rule to upload." });
      return;
    }

    try {
      setLoading(true);
-     setStatus({ tone: "info", message:
-     const response = await fetch(`${BACKEND_BASE_URL}/admin/rules/bulk`, {
        method: "POST",
        headers,
        body: JSON.stringify({ rules: lines }),
@@ -82,9 +91,11 @@
      const details = await response.text();
      throw new Error(details || `Backend error ${response.status}`);
    }
    await handleRefresh();
    setRulesInput("");
-
  } catch (error: any) {
    setStatus({ tone: "error", message: error.message || "Failed to upload rules" });
  } finally {
@@ -92,6 +103,123 @@
    }
  }, [handleRefresh, headers, requireTenant, rulesInput]);

  const handleDelete = useCallback(async () => {
    if (!requireTenant()) return;
    if (!deleteInput.trim()) {
@@ -140,105 +268,258 @@
        </Link>
      </div>
    </div>
-   <
-
  </header>

  <AdminRulesPanel />

  <section className="rounded-3xl border border-white/10 bg-white/5 p-6 shadow-2xl shadow-slate-950/40">
    <div className="flex flex-col gap-6">
-     <div className="flex items-center justify-
      <button
        onClick={handleRefresh}
        disabled={loading}
-       className="flex-
      >
-
      </button>
    </div>

-   <div className="grid gap-
-   <
-

-   <
-
      </div>

      {status && (
        <div
-         className={`rounded-
            status.tone === "error"
-             ? "border-rose-500/
            : status.tone === "success"
-             ? "border-emerald-500/
-             : "border-cyan-500/
          }`}
        >
-
        </div>
      )}

-     <div className="rounded-2xl border border-white/10 bg-slate-950/
-       <div className="flex items-center justify-between border-b border-white/5 px-
-         <
-
        </div>
-       <div className="overflow-x-auto">
-         <table className="w-full text-left text-sm
-           <thead className="bg-
            <tr>
-             <th className="px-
-             <th className="px-
            </tr>
          </thead>
          <tbody>
            {rules.length === 0 && (
              <tr>
-               <td colSpan={2} className="px-
-
                </td>
              </tr>
            )}
            {rules.map((rule, idx) => (
-             <tr
-
              </tr>
            ))}
          </tbody>
| 1 |
"use client";
|
| 2 |
|
| 3 |
+
import { useCallback, useMemo, useState, useRef, useEffect } from "react";
|
| 4 |
import Link from "next/link";
|
| 5 |
|
| 6 |
import { AdminRulesPanel } from "@/components/admin-rules-panel";
|
|
|
|
| 19 |
const [rules, setRules] = useState<string[]>([]);
|
| 20 |
const [loading, setLoading] = useState(false);
|
| 21 |
const [status, setStatus] = useState<StatusState>(null);
|
| 22 |
+
const [isDragging, setIsDragging] = useState(false);
|
| 23 |
+
const [lastUpdated, setLastUpdated] = useState<string>("");
|
| 24 |
+
const fileInputRef = useRef<HTMLInputElement>(null);
|
| 25 |
+
|
| 26 |
+
// Set initial time only on client side to avoid hydration mismatch
|
| 27 |
+
useEffect(() => {
|
| 28 |
+
setLastUpdated(new Date().toLocaleTimeString());
|
| 29 |
+
}, []);
|
| 30 |
|
| 31 |
const headers = useMemo(() => {
|
| 32 |
if (!tenantId.trim()) return undefined;
|
|
|
|
| 58 |
}
|
| 59 |
const data = await response.json();
|
| 60 |
setRules(data.rules ?? []);
|
| 61 |
+
setLastUpdated(new Date().toLocaleTimeString());
|
| 62 |
setStatus({ tone: "success", message: "Rules synced." });
|
| 63 |
} catch (error: any) {
|
| 64 |
setStatus({ tone: "error", message: error.message || "Failed to fetch rules" });
|
|
|
|
| 72 |
const lines = rulesInput
|
| 73 |
.split("\n")
|
| 74 |
.map((line) => line.trim())
|
| 75 |
+
.filter((line) => line && !line.startsWith("#")); // Filter out comments and empty lines
|
| 76 |
|
| 77 |
if (!lines.length) {
|
| 78 |
+
setStatus({ tone: "error", message: "Add at least one rule to upload. (Comment lines starting with # are ignored)" });
|
| 79 |
return;
|
| 80 |
}
|
| 81 |
|
| 82 |
try {
|
| 83 |
setLoading(true);
|
| 84 |
+
setStatus({ tone: "info", message: `Uploading ${lines.length} rule(s)...` });
|
| 85 |
+
const response = await fetch(`${BACKEND_BASE_URL}/admin/rules/bulk?enhance=true`, {
|
| 86 |
method: "POST",
|
| 87 |
headers,
|
| 88 |
body: JSON.stringify({ rules: lines }),
|
|
|
|
| 91 |
const details = await response.text();
|
| 92 |
throw new Error(details || `Backend error ${response.status}`);
|
| 93 |
}
|
| 94 |
+
const data = await response.json();
|
| 95 |
await handleRefresh();
|
| 96 |
setRulesInput("");
|
| 97 |
+
const enhancedMsg = data.enhanced ? " (enhanced by LLM)" : "";
|
| 98 |
+
setStatus({ tone: "success", message: `Uploaded ${data.added_rules?.length || lines.length} rule(s)${enhancedMsg}.` });
|
| 99 |
} catch (error: any) {
|
| 100 |
setStatus({ tone: "error", message: error.message || "Failed to upload rules" });
|
| 101 |
} finally {
|
|
|
|
| 103 |
}
|
| 104 |
}, [handleRefresh, headers, requireTenant, rulesInput]);
|
| 105 |
|
| 106 |
+
const processFile = useCallback(async (file: File) => {
|
| 107 |
+
if (!requireTenant()) return;
|
| 108 |
+
|
| 109 |
+
const fileExt = file.name.split('.').pop()?.toLowerCase();
|
| 110 |
+
if (!fileExt || !['txt', 'pdf', 'doc', 'docx', 'md'].includes(fileExt)) {
|
| 111 |
+
setStatus({ tone: "error", message: "Unsupported file type. Supported: TXT, PDF, DOC, DOCX, MD" });
|
| 112 |
+
return;
|
| 113 |
+
}
|
| 114 |
+
|
| 115 |
+
try {
|
| 116 |
+
setLoading(true);
|
| 117 |
+
setStatus({ tone: "info", message: `Uploading and processing ${file.name}...` });
|
| 118 |
+
|
| 119 |
+
// For TXT files, read client-side for faster processing
|
| 120 |
+
if (fileExt === 'txt' || fileExt === 'md') {
|
| 121 |
+
const fileContent = await file.text();
|
| 122 |
+
const lines = fileContent
|
| 123 |
+
.split("\n")
|
| 124 |
+
.map((line) => line.trim())
|
| 125 |
+
.filter((line) => line && !line.startsWith("#"));
|
| 126 |
+
|
| 127 |
+
if (!lines.length) {
|
| 128 |
+
setStatus({ tone: "error", message: "No valid rules found in file (after filtering comments)." });
|
| 129 |
+
setLoading(false);
|
| 130 |
+
return;
|
| 131 |
+
}
|
| 132 |
+
|
| 133 |
+
// Upload rules via bulk endpoint
|
| 134 |
+
setStatus({ tone: "info", message: `Uploading ${lines.length} rule(s)...` });
|
| 135 |
+
const response = await fetch(`${BACKEND_BASE_URL}/admin/rules/bulk?enhance=true`, {
|
| 136 |
+
method: "POST",
|
| 137 |
+
headers,
|
| 138 |
+
body: JSON.stringify({ rules: lines }),
|
| 139 |
+
});
|
| 140 |
+
|
| 141 |
+
if (!response.ok) {
|
| 142 |
+
const details = await response.text();
|
| 143 |
+
throw new Error(details || `Backend error ${response.status}`);
|
| 144 |
+
}
|
| 145 |
+
|
| 146 |
+
const data = await response.json();
|
| 147 |
+
await handleRefresh();
|
| 148 |
+
const enhancedMsg = data.enhanced ? " (enhanced by LLM)" : "";
|
| 149 |
+
setStatus({ tone: "success", message: `Uploaded ${data.added_rules?.length || lines.length} rule(s) from ${file.name}${enhancedMsg}.` });
|
| 150 |
+
return;
|
| 151 |
+
}
|
| 152 |
+
|
| 153 |
+
// For PDF, DOC, DOCX - use backend file upload endpoint
|
| 154 |
+
const formData = new FormData();
|
| 155 |
+
formData.append('file', file);
|
| 156 |
+
|
| 157 |
+
setStatus({ tone: "info", message: `Extracting text from ${file.name}...` });
|
| 158 |
+
const response = await fetch(`${BACKEND_BASE_URL}/admin/rules/upload-file?enhance=true`, {
|
| 159 |
+
method: "POST",
|
| 160 |
+
headers: {
|
| 161 |
+
"x-tenant-id": tenantId.trim(),
|
| 162 |
+
},
|
| 163 |
+
body: formData,
|
| 164 |
+
});
|
| 165 |
+
|
| 166 |
+
if (!response.ok) {
|
| 167 |
+
const details = await response.text();
|
| 168 |
+
throw new Error(details || `Backend error ${response.status}`);
|
| 169 |
+
}
|
| 170 |
+
|
| 171 |
+
const data = await response.json();
|
| 172 |
+
await handleRefresh();
|
| 173 |
+
const enhancedMsg = data.enhanced ? " (enhanced by LLM)" : "";
|
| 174 |
+
setStatus({
|
| 175 |
+
tone: "success",
|
| 176 |
+
message: `Uploaded ${data.added_rules?.length || data.total_extracted || 0} rule(s) from ${file.name}${enhancedMsg}.`
|
| 177 |
+
});
|
| 178 |
+
} catch (error: any) {
|
| 179 |
+
setStatus({ tone: "error", message: error.message || "Failed to upload rules from file" });
|
| 180 |
+
} finally {
|
| 181 |
+
setLoading(false);
|
| 182 |
+
}
|
| 183 |
+
}, [handleRefresh, headers, requireTenant, tenantId]);
|
| 184 |
+
|
| 185 |
+
const handleFileUpload = useCallback(async (event: React.ChangeEvent<HTMLInputElement>) => {
|
| 186 |
+
const file = event.target.files?.[0];
|
| 187 |
+
if (!file) return;
|
| 188 |
+
await processFile(file);
|
| 189 |
+
if (fileInputRef.current) {
|
| 190 |
+
fileInputRef.current.value = "";
|
| 191 |
+
}
|
| 192 |
+
}, [processFile]);
|
| 193 |
+
|
| 194 |
+
const handleDragOver = useCallback((e: React.DragEvent) => {
|
| 195 |
+
e.preventDefault();
|
| 196 |
+
e.stopPropagation();
|
| 197 |
+
setIsDragging(true);
|
| 198 |
+
}, []);
|
| 199 |
+
|
| 200 |
+
const handleDragLeave = useCallback((e: React.DragEvent) => {
|
| 201 |
+
e.preventDefault();
|
| 202 |
+
e.stopPropagation();
|
| 203 |
+
setIsDragging(false);
|
| 204 |
+
}, []);
|
| 205 |
+
|
| 206 |
+
const handleDrop = useCallback(async (e: React.DragEvent) => {
|
| 207 |
+
e.preventDefault();
|
| 208 |
+
e.stopPropagation();
|
| 209 |
+
setIsDragging(false);
|
| 210 |
+
|
| 211 |
+
const file = e.dataTransfer.files?.[0];
|
| 212 |
+
if (!file) return;
|
| 213 |
+
|
| 214 |
+
const fileExt = file.name.split('.').pop()?.toLowerCase();
|
| 215 |
+
if (!fileExt || !['txt', 'pdf', 'doc', 'docx', 'md'].includes(fileExt)) {
|
| 216 |
+
setStatus({ tone: "error", message: "Unsupported file type. Supported: TXT, PDF, DOC, DOCX, MD" });
|
| 217 |
+
return;
|
| 218 |
+
}
|
| 219 |
+
|
| 220 |
+
await processFile(file);
|
| 221 |
+
}, [processFile]);
|
| 222 |
+
|
| 223 |
const handleDelete = useCallback(async () => {
|
| 224 |
if (!requireTenant()) return;
|
| 225 |
if (!deleteInput.trim()) {
|
|
|
|
| 268 |
           </Link>
         </div>
       </div>
+      <div className="space-y-2">
+        <p className="text-sm text-slate-300">
+          Upload governance policies, compliance workflows, and red-flag patterns. Rules are automatically enhanced by LLM and stored in the backend.
+        </p>
+        <div className="flex flex-wrap gap-2 text-xs text-slate-400">
+          <span className="flex items-center gap-1 rounded-full bg-white/5 px-3 py-1">
+            <span>✨</span>
+            <span>LLM Enhanced</span>
+          </span>
+          <span className="flex items-center gap-1 rounded-full bg-white/5 px-3 py-1">
+            <span>📁</span>
+            <span>File Upload</span>
+          </span>
+          <span className="flex items-center gap-1 rounded-full bg-white/5 px-3 py-1">
+            <span>🔄</span>
+            <span>Chunk Processing</span>
+          </span>
+        </div>
+      </div>
     </header>

     <AdminRulesPanel />

     <section className="rounded-3xl border border-white/10 bg-white/5 p-6 shadow-2xl shadow-slate-950/40">
       <div className="flex flex-col gap-6">
+        <div className="flex items-center justify-between gap-3">
+          {lastUpdated && (
+            <div className="flex items-center gap-2 text-sm text-slate-400">
+              <span>🕒</span>
+              <span>Last updated: {lastUpdated}</span>
+            </div>
+          )}
+          {!lastUpdated && <div></div>}
           <button
             onClick={handleRefresh}
             disabled={loading}
+            className="flex items-center gap-2 rounded-xl bg-gradient-to-r from-cyan-400 to-blue-500 px-6 py-3 text-sm font-semibold text-slate-950 shadow-lg shadow-cyan-500/30 transition hover:shadow-cyan-500/50 disabled:opacity-50 disabled:cursor-not-allowed"
           >
+            {loading ? (
+              <>
+                <span className="animate-spin">⏳</span>
+                <span>Refreshing...</span>
+              </>
+            ) : (
+              <>
+                <span>🔄</span>
+                <span>Refresh Rules</span>
+              </>
+            )}
           </button>
         </div>

+        <div className="grid gap-6 lg:grid-cols-2">
+          {/* Left Column: Upload Rules */}
+          <div className="flex flex-col gap-4 rounded-2xl border border-white/10 bg-slate-900/30 p-6">
+            <div className="flex items-center gap-2">
+              <span className="text-lg">📝</span>
+              <h3 className="text-lg font-semibold text-slate-200">Add Rules</h3>
+            </div>
+
+            <label className="flex flex-col gap-2 text-sm font-semibold text-slate-200">
+              <span>Bulk Upload Rules (one per line)</span>
+              <textarea
+                value={rulesInput}
+                onChange={(e) => setRulesInput(e.target.value)}
+                placeholder={"Block password disclosure requests\nPrevent sharing of API keys\nNo sharing of credit card information"}
+                rows={8}
+                className="rounded-xl border border-white/10 bg-slate-900/50 px-4 py-3 text-sm text-white placeholder:text-slate-500 outline-none ring-0 transition focus:border-cyan-400 focus:ring-2 focus:ring-cyan-400/20"
+              />
+              <span className="text-xs text-slate-400">
+                💡 Tip: Comment lines (starting with #) are automatically ignored
+              </span>
+            </label>
+
+            <button
+              onClick={handleUpload}
+              disabled={loading || !rulesInput.trim()}
+              className="rounded-xl bg-gradient-to-r from-emerald-400 to-lime-400 px-6 py-3 text-sm font-semibold text-slate-900 shadow-lg shadow-emerald-500/30 transition hover:shadow-emerald-500/50 disabled:opacity-50 disabled:cursor-not-allowed"
+            >
+              {loading ? "⏳ Uploading..." : "✅ Upload / Append Rules"}
+            </button>
+
+            <div className="flex items-center gap-3 py-2">
+              <span className="h-px flex-1 bg-white/10"></span>
+              <span className="text-xs font-semibold uppercase tracking-wider text-slate-400">OR</span>
+              <span className="h-px flex-1 bg-white/10"></span>
+            </div>
+
+            <label className="flex flex-col gap-2 text-sm font-semibold text-slate-200">
+              <span>📁 Upload Rules from File</span>
+              <div
+                onDragOver={handleDragOver}
+                onDragLeave={handleDragLeave}
+                onDrop={handleDrop}
+                className={`relative rounded-xl border-2 border-dashed transition-all ${
+                  isDragging
+                    ? "border-cyan-400 bg-cyan-500/10 scale-[1.02]"
+                    : "border-white/20 bg-slate-900/50 hover:border-cyan-400/50 hover:bg-slate-900/70"
+                }`}
+              >
+                <input
+                  ref={fileInputRef}
+                  type="file"
+                  accept=".txt,.pdf,.doc,.docx,.md"
+                  onChange={handleFileUpload}
+                  disabled={loading}
+                  className="hidden"
+                  id="file-upload-input"
+                />
+                <label
+                  htmlFor="file-upload-input"
+                  className="flex flex-col items-center justify-center gap-3 p-8 cursor-pointer"
+                >
+                  {isDragging ? (
+                    <>
+                      <span className="text-4xl animate-bounce">📥</span>
+                      <span className="text-sm font-semibold text-cyan-300">Drop file here</span>
+                    </>
+                  ) : (
+                    <>
+                      <span className="text-4xl">📄</span>
+                      <div className="text-center">
+                        <span className="text-sm font-semibold text-slate-200">
+                          Drag & drop file here
+                        </span>
+                        <span className="text-xs text-slate-400 block mt-1">or click to browse</span>
+                      </div>
+                      <button
+                        type="button"
+                        disabled={loading}
+                        className="mt-2 rounded-lg bg-gradient-to-r from-cyan-500 to-blue-500 px-4 py-2 text-xs font-semibold text-slate-900 transition hover:from-cyan-400 hover:to-blue-400 disabled:opacity-50"
+                      >
+                        Choose File
+                      </button>
+                    </>
+                  )}
+                </label>
+              </div>
+              <span className="text-xs text-slate-400">
+                Supported: TXT, PDF, DOC, DOCX, MD • Files processed server-side with LLM enhancement
+              </span>
+            </label>
+          </div>

+          {/* Right Column: Delete Rules */}
+          <div className="flex flex-col gap-4 rounded-2xl border border-white/10 bg-slate-900/30 p-6">
+            <div className="flex items-center gap-2">
+              <span className="text-lg">🗑️</span>
+              <h3 className="text-lg font-semibold text-slate-200">Delete Rule</h3>
+            </div>
+
+            <label className="flex flex-col gap-2 text-sm font-semibold text-slate-200">
+              <span>Enter exact rule text to remove</span>
+              <textarea
+                value={deleteInput}
+                onChange={(e) => setDeleteInput(e.target.value)}
+                placeholder="Paste the exact rule text here to delete it..."
+                rows={8}
+                className="rounded-xl border border-white/10 bg-slate-900/50 px-4 py-3 text-sm text-white placeholder:text-slate-500 outline-none ring-0 transition focus:border-rose-400 focus:ring-2 focus:ring-rose-400/20"
+              />
+              <span className="text-xs text-slate-400">
+                ⚠️ This action cannot be undone. Make sure the text matches exactly.
+              </span>
+            </label>
+
+            <button
+              onClick={handleDelete}
+              disabled={loading || !deleteInput.trim()}
+              className="rounded-xl border-2 border-rose-500 bg-rose-500/10 px-6 py-3 text-sm font-semibold text-rose-300 transition hover:bg-rose-500/20 hover:border-rose-400 disabled:opacity-50 disabled:cursor-not-allowed"
+            >
+              {loading ? "⏳ Deleting..." : "🗑️ Delete Rule"}
+            </button>
+          </div>
         </div>

         {status && (
           <div
+            className={`rounded-xl border-2 px-5 py-4 text-sm font-medium shadow-lg ${
               status.tone === "error"
+                ? "border-rose-500/50 bg-rose-500/10 text-rose-200 shadow-rose-500/20"
                 : status.tone === "success"
+                  ? "border-emerald-500/50 bg-emerald-500/10 text-emerald-200 shadow-emerald-500/20"
+                  : "border-cyan-500/50 bg-cyan-500/10 text-cyan-200 shadow-cyan-500/20"
             }`}
           >
+            <div className="flex items-start gap-3">
+              <span className="text-lg">
+                {status.tone === "error" ? "❌" : status.tone === "success" ? "✅" : "ℹ️"}
+              </span>
+              <span className="flex-1">{status.message}</span>
+            </div>
           </div>
         )}

+        <div className="rounded-2xl border border-white/10 bg-gradient-to-br from-slate-900/60 to-slate-950/60 shadow-xl">
+          <div className="flex items-center justify-between border-b border-white/10 bg-white/5 px-6 py-4">
+            <div className="flex items-center gap-3">
+              <span className="text-xl">📋</span>
+              <h3 className="text-base font-semibold uppercase tracking-[0.2em] text-slate-300">Rule Set</h3>
+            </div>
+            <div className="flex items-center gap-3">
+              <button
+                onClick={handleRefresh}
+                disabled={loading}
+                className="flex items-center gap-2 rounded-lg border border-cyan-500/30 bg-cyan-500/10 px-4 py-2 text-xs font-semibold text-cyan-300 transition hover:bg-cyan-500/20 hover:border-cyan-400/50 disabled:opacity-50 disabled:cursor-not-allowed"
+                title="Refresh rules from database"
+              >
+                {loading ? (
+                  <>
+                    <span className="animate-spin">⏳</span>
+                    <span>Refreshing...</span>
+                  </>
+                ) : (
+                  <>
+                    <span>🔄</span>
+                    <span>Refresh</span>
+                  </>
+                )}
+              </button>
+              <div className="flex items-center gap-2 rounded-full bg-cyan-500/20 px-4 py-2">
+                <span className="text-sm font-semibold text-cyan-300">{rules.length}</span>
+                <span className="text-xs text-slate-400">entries</span>
+              </div>
+            </div>
           </div>
+          <div className="overflow-x-auto max-h-[500px] overflow-y-auto">
+            <table className="w-full text-left text-sm">
+              <thead className="sticky top-0 bg-slate-900/95 backdrop-blur-sm text-xs uppercase tracking-[0.2em] text-slate-400">
                 <tr>
+                  <th className="px-6 py-4 font-semibold">#</th>
+                  <th className="px-6 py-4 font-semibold">Rule</th>
                 </tr>
               </thead>
               <tbody>
                 {rules.length === 0 && (
                   <tr>
+                    <td colSpan={2} className="px-6 py-12 text-center">
+                      <div className="flex flex-col items-center gap-3">
+                        <span className="text-4xl">📋</span>
+                        <p className="text-slate-400">No rules loaded</p>
+                        <p className="text-xs text-slate-500">Use the refresh button above to load rules</p>
+                      </div>
                     </td>
                   </tr>
                 )}
                 {rules.map((rule, idx) => (
+                  <tr
+                    key={`${rule}-${idx}`}
+                    className="border-t border-white/5 transition hover:bg-white/5"
+                  >
+                    <td className="px-6 py-4 text-slate-400 font-mono">{idx + 1}</td>
+                    <td className="px-6 py-4 text-slate-200">{rule}</td>
                   </tr>
                 ))}
               </tbody>
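The admin UI above promises that comment lines starting with `#` are ignored during bulk upload. A minimal Python sketch of that server-side parsing step (the function name and exact semantics are illustrative, not the project's actual backend code):

```python
def parse_rules(text: str) -> list[str]:
    """Split pasted or uploaded text into one rule per non-empty line.

    Illustrative sketch: blank lines and comment lines starting with "#"
    are skipped, mirroring the bulk-upload tip shown in the admin UI.
    """
    rules: list[str] = []
    for line in text.splitlines():
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue  # ignore comments and blank lines
        rules.append(stripped)
    return rules
```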
frontend/components/admin-rules-panel.tsx
CHANGED

@@ -1,32 +1,36 @@
 export function AdminRulesPanel() {
   const highlights = [
     {
       title: "Bulk Upload",
-      description: "Paste multiple
     },
     {
-
-
     },
     {
-
-
     },
     {
       title: "Tenant Isolation",
       description: "Rules are scoped per tenant, ensuring zero data leakage.",
     },
   ];

   return (
-    <div className="rounded-3xl border border-slate-
-    <div className="flex flex-col gap-
     <div>
-      <p className="text-sm font-semibold uppercase tracking-[0.3em] text-cyan-400"
-      <h2 className="mt-3 text-3xl font-
-      <p className="mt-3 text-base text-slate-300">
-        Upload governance policies, red-flag keywords, and compliance workflows
-
       </p>
     </div>

@@ -34,22 +38,18 @@ export function AdminRulesPanel() {
     {highlights.map((item) => (
       <div
         key={item.title}
-        className="rounded-
       >
-        <
-
       </div>
     ))}
   </div>
-
-  <div className="flex flex-wrap gap-3">
-    <button className="rounded-full bg-gradient-to-r from-cyan-400 to-blue-500 px-6 py-2 text-sm font-semibold text-slate-950 shadow-lg shadow-cyan-500/30">
-      Launch Admin Console
-    </button>
-    <button className="rounded-full border border-white/20 px-6 py-2 text-sm font-semibold text-white hover:border-cyan-400/60">
-      View Rule Templates
-    </button>
-  </div>
   </div>
   </div>
 );
 export function AdminRulesPanel() {
   const highlights = [
     {
+      icon: "📋",
       title: "Bulk Upload",
+      description: "Paste multiple rules or upload from files (TXT, PDF, DOC, DOCX).",
     },
     {
+      icon: "🤖",
+      title: "LLM Enhancement",
+      description: "Rules are automatically enhanced with edge cases and improved patterns.",
     },
     {
+      icon: "⚡",
+      title: "Chunk Processing",
+      description: "Large rule sets processed in chunks to avoid timeouts.",
     },
     {
+      icon: "🔒",
       title: "Tenant Isolation",
       description: "Rules are scoped per tenant, ensuring zero data leakage.",
     },
   ];

   return (
+    <div className="rounded-3xl border border-white/10 bg-gradient-to-br from-slate-900/80 to-slate-950/80 p-8 shadow-2xl shadow-slate-950/40">
+      <div className="flex flex-col gap-6">
         <div>
+          <p className="text-sm font-semibold uppercase tracking-[0.3em] text-cyan-400">🛡️ Admin Controls</p>
+          <h2 className="mt-3 text-3xl font-bold text-white">Admin Rule Management</h2>
+          <p className="mt-3 text-base leading-relaxed text-slate-300">
+            Upload governance policies, red-flag keywords, and compliance workflows. Rules are automatically enhanced by LLM,
+            stored in Supabase/SQLite, and enforced across all MCP toolchains with intelligent pattern matching.
           </p>
         </div>

         {highlights.map((item) => (
           <div
             key={item.title}
+            className="group rounded-xl border border-white/10 bg-white/5 p-5 text-slate-200 transition-all hover:border-cyan-400/40 hover:bg-white/10 hover:shadow-lg hover:shadow-cyan-500/10"
           >
+            <div className="flex items-start gap-3">
+              <span className="text-2xl">{item.icon}</span>
+              <div>
+                <p className="text-sm font-semibold text-cyan-300">{item.title}</p>
+                <p className="mt-2 text-sm leading-relaxed text-slate-300">{item.description}</p>
+              </div>
+            </div>
           </div>
         ))}
       </div>
     </div>
   </div>
 );
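The "Chunk Processing" highlight above describes splitting large rule sets before LLM enhancement so each call stays small. A minimal sketch of that splitting step (the default chunk size of 20 is an assumed illustrative value, not the project's configured one):

```python
def chunk_rules(rules: list[str], chunk_size: int = 20) -> list[list[str]]:
    """Split a large rule list into fixed-size chunks so each
    enhancement request stays small enough to avoid timeouts."""
    if chunk_size <= 0:
        raise ValueError("chunk_size must be positive")
    # Slice the list in strides of chunk_size; the last chunk may be shorter
    return [rules[i:i + chunk_size] for i in range(0, len(rules), chunk_size)]
```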
setup_supabase_table.py
ADDED
|
@@ -0,0 +1,121 @@
"""
Automated Supabase Table Setup
Guides creation of the admin_rules table in Supabase (the SQL itself must be
run in the Supabase SQL Editor) and verifies the result.
"""

import os
import sys
from pathlib import Path
from dotenv import load_dotenv

# Load environment variables
load_dotenv()

SUPABASE_URL = os.getenv("SUPABASE_URL")
SUPABASE_SERVICE_KEY = os.getenv("SUPABASE_SERVICE_KEY")

if not SUPABASE_URL or not SUPABASE_SERVICE_KEY:
    print("❌ Missing Supabase credentials!")
    print("   Please set SUPABASE_URL and SUPABASE_SERVICE_KEY in your .env file")
    sys.exit(1)


def create_table_via_supabase():
    """
    Create table using Supabase client and direct table operations.
    Since Supabase doesn't allow direct SQL execution via REST API,
    we'll create the table structure using the Supabase client.
    """
    try:
        from supabase import create_client

        print("🔌 Connecting to Supabase...")
        client = create_client(SUPABASE_URL, SUPABASE_SERVICE_KEY)

        # Read SQL file
        sql_file = Path(__file__).parent / "supabase_admin_rules_table.sql"
        if not sql_file.exists():
            print(f"❌ SQL file not found: {sql_file}")
            return False

        with open(sql_file, "r", encoding="utf-8") as f:
            sql_content = f.read()

        print("📄 SQL Script loaded from supabase_admin_rules_table.sql")
        print("\n" + "=" * 60)
        print("⚠️  IMPORTANT: Supabase Python client cannot execute raw SQL")
        print("=" * 60)
        print("\nYou need to run the SQL manually in Supabase Dashboard:")
        print("\n📋 Steps:")
        print("  1. Open: https://app.supabase.com")
        print("  2. Select your project")
        print("  3. Go to: SQL Editor (left sidebar)")
        print("  4. Click: 'New query'")
        print("  5. Copy the SQL below and paste it:")
        print("\n" + "-" * 60)
        print(sql_content)
        print("-" * 60)
        print("\n  6. Click 'Run' button (or press Ctrl+Enter)")
        print("  7. Wait for success message")
        print("\n✅ After running, the table will be created!")

        # Try to verify table exists (after user runs SQL)
        print("\n🔍 Checking if table exists...")
        try:
            result = client.table("admin_rules").select("id").limit(1).execute()
            print("✅ Table 'admin_rules' exists and is accessible!")
            return True
        except Exception as e:
            if "relation" in str(e).lower() or "does not exist" in str(e).lower():
                print("⚠️  Table does not exist yet.")
                print("   Please run the SQL script in Supabase SQL Editor first.")
                return False
            else:
                # Table might be empty, which is fine
                print("✅ Table exists (might be empty)")
                return True

    except ImportError:
        print("❌ Supabase client not installed")
        print("   Run: pip install supabase")
        return False
    except Exception as e:
        print(f"❌ Error: {e}")
        return False


def create_table_via_http():
    """
    Alternative: Try to create table via HTTP POST to Supabase REST API.
    This method uses the PostgREST API to create tables.
    Note: This typically requires admin privileges and may not work.
    """
    import httpx

    # This approach won't work because Supabase doesn't allow DDL via REST API
    # But we can try to use the pg_net extension if available
    print("⚠️  Direct HTTP table creation is not supported by Supabase REST API")
    print("   Supabase requires SQL execution via the SQL Editor for security reasons")
    return False


if __name__ == "__main__":
    print("=" * 60)
    print("Supabase Admin Rules Table Setup")
    print("=" * 60)
    print()

    # Method 1: Try via Supabase client (will show instructions)
    success = create_table_via_supabase()

    if not success:
        print("\n" + "=" * 60)
        print("📝 Manual Setup Required")
        print("=" * 60)
        print("\nSince Supabase doesn't allow programmatic SQL execution")
        print("for security reasons, you need to run the SQL manually.")
        print("\nThe SQL script is ready in: supabase_admin_rules_table.sql")
        print("\nAfter running the SQL in Supabase Dashboard:")
        print("  - The table will be created")
        print("  - RulesStore will automatically use Supabase")
        print("  - All rules will be saved to Supabase instead of SQLite")
supabase_admin_rules_table.sql
ADDED
|
@@ -0,0 +1,59 @@
-- =============================================================
-- Supabase Table Schema for Admin Rules
-- =============================================================
-- Run this SQL in your Supabase SQL Editor to create the admin_rules table
-- =============================================================

CREATE TABLE IF NOT EXISTS admin_rules (
    id BIGSERIAL PRIMARY KEY,
    tenant_id TEXT NOT NULL,
    rule TEXT NOT NULL,
    pattern TEXT,
    severity TEXT DEFAULT 'medium' CHECK (severity IN ('low', 'medium', 'high', 'critical')),
    description TEXT,
    enabled BOOLEAN DEFAULT true,
    created_at TIMESTAMPTZ DEFAULT NOW(),
    updated_at TIMESTAMPTZ DEFAULT NOW(),
    UNIQUE(tenant_id, rule)
);

-- Create indexes for faster tenant-based queries
CREATE INDEX IF NOT EXISTS idx_admin_rules_tenant_id ON admin_rules(tenant_id);
CREATE INDEX IF NOT EXISTS idx_admin_rules_enabled ON admin_rules(enabled);

-- Create composite index for lookups by tenant and enabled status
CREATE INDEX IF NOT EXISTS idx_admin_rules_tenant_enabled ON admin_rules(tenant_id, enabled);

-- Enable Row Level Security (RLS) - optional, adjust based on your needs
ALTER TABLE admin_rules ENABLE ROW LEVEL SECURITY;

-- Create policy to allow service role to access all rows
-- Adjust this policy based on your security requirements
CREATE POLICY "Service role can manage all admin rules"
    ON admin_rules
    FOR ALL
    USING (true)
    WITH CHECK (true);

-- Create a function to automatically update updated_at timestamp
CREATE OR REPLACE FUNCTION update_updated_at_column()
RETURNS TRIGGER AS $$
BEGIN
    NEW.updated_at = NOW();
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

-- Create trigger to automatically update updated_at
CREATE TRIGGER update_admin_rules_updated_at
    BEFORE UPDATE ON admin_rules
    FOR EACH ROW
    EXECUTE FUNCTION update_updated_at_column();

-- =============================================================
-- Example queries to verify the table:
-- =============================================================
-- SELECT * FROM admin_rules WHERE tenant_id = 'your_tenant_id';
-- SELECT * FROM admin_rules WHERE enabled = true;
-- =============================================================
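The commit also describes a RulesStore that falls back from Supabase to SQLite. The sketch below mirrors the schema's `UNIQUE(tenant_id, rule)` constraint and tenant-scoped reads using only the SQLite fallback path; the class and method names are illustrative, not the actual API of backend/api/storage/rules_store.py:

```python
import sqlite3


class RulesStoreSketch:
    """Illustrative tenant-scoped rule store using the SQLite fallback path."""

    def __init__(self, sqlite_path: str = ":memory:"):
        self.db = sqlite3.connect(sqlite_path)
        # Mirrors the Supabase schema's UNIQUE(tenant_id, rule) constraint
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS admin_rules ("
            " tenant_id TEXT NOT NULL,"
            " rule TEXT NOT NULL,"
            " enabled INTEGER DEFAULT 1,"
            " UNIQUE(tenant_id, rule))"
        )

    def add_rule(self, tenant_id: str, rule: str) -> None:
        # INSERT OR IGNORE makes repeated uploads of the same rule idempotent
        self.db.execute(
            "INSERT OR IGNORE INTO admin_rules (tenant_id, rule) VALUES (?, ?)",
            (tenant_id, rule),
        )

    def list_rules(self, tenant_id: str) -> list[str]:
        # Tenant-scoped read: one tenant never sees another tenant's rules
        rows = self.db.execute(
            "SELECT rule FROM admin_rules WHERE tenant_id = ? AND enabled = 1",
            (tenant_id,),
        ).fetchall()
        return [row[0] for row in rows]
```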