# Tech-Spec 2: Remote Application Logging with Redis

**Created:** 2025-12-22
**Status:** Ready for Development

## Overview

### Problem Statement
Application logs need to be centralized in Redis so they can be accessed and monitored remotely across multiple Gunicorn workers. Centralized logs will help with debugging issues such as the background task tracking problem and provide better visibility into application behavior and performance.

### Solution
Implement a logging system that sends application logs to Redis, enabling remote access, centralized monitoring, and real-time log streaming.

### Scope (In/Out)

**In Scope:**
- Implement Redis-based logging handler
- Configure log formatting for Redis storage
- Create log retrieval endpoints
- Implement log streaming capabilities
- Add log retention and cleanup policies

**Out of Scope:**
- Replacing existing logging infrastructure completely
- Implementing complex log analytics
- Creating a full dashboard UI

## Context for Development

### Codebase Patterns
- Flask application with existing logging infrastructure
- Multiple Gunicorn workers requiring centralized logging
- Need for remote log access and debugging

### Files to Reference
- `app.py` - Main Flask application where logging is configured
- `gunicorn.conf.py` - Gunicorn configuration
- Any existing logging configuration files

### Technical Decisions
- Use Redis lists for log storage with push operations
- Implement log level filtering and retention
- Use structured logging format for easier parsing
- Add log rotation to prevent Redis memory issues

## Implementation Plan

### Tasks

- [ ] Task 1: Set up Redis connection for logging
- [ ] Task 2: Create custom Python logging handler for Redis
- [ ] Task 3: Configure Flask application to use Redis logger
- [ ] Task 4: Create API endpoints to retrieve logs from Redis
- [ ] Task 5: Implement log retention and cleanup policies
- [ ] Task 6: Test logging functionality with multiple Gunicorn workers

### Acceptance Criteria

- [ ] AC 1: Application logs are stored in Redis with proper formatting
- [ ] AC 2: Remote access to logs via API endpoints
- [ ] AC 3: Logs from all Gunicorn workers are consolidated in Redis
- [ ] AC 4: Log retention prevents Redis memory overflow
- [ ] AC 5: Structured logs include timestamp, level, module, and message

## Additional Context

### Implementation Details

#### 1. Redis Logging Handler
```python
import json
import logging
from datetime import datetime, timezone

class RedisLogHandler(logging.Handler):
    """Logging handler that pushes structured JSON entries onto a Redis list."""

    def __init__(self, redis_client, key='app_logs', max_logs=1000):
        super().__init__()
        self.redis_client = redis_client
        self.key = key
        self.max_logs = max_logs

    def emit(self, record):
        try:
            log_entry = {
                # Timezone-aware UTC timestamp (datetime.utcnow() is deprecated)
                'timestamp': datetime.now(timezone.utc).isoformat(),
                'level': record.levelname,
                'module': record.module,
                'function': record.funcName,
                'line': record.lineno,
                'message': record.getMessage(),
                'logger': record.name
            }
            # Include the formatted traceback when an exception is attached
            if record.exc_info:
                log_entry['exception'] = self.format(record)

            # Push to the Redis list (newest entry lands at index 0)
            self.redis_client.lpush(self.key, json.dumps(log_entry))
            # Trim to max_logs so the list cannot grow without bound
            self.redis_client.ltrim(self.key, 0, self.max_logs - 1)
        except Exception:
            self.handleError(record)
```

#### 2. Flask Integration
```python
import logging

import redis
from flask import Flask

# Initialize the Redis connection used by the log handler
redis_client = redis.Redis(host='localhost', port=6379, db=0)

app = Flask(__name__)

# Attach the Redis handler to the Flask app logger
redis_handler = RedisLogHandler(redis_client, key='flask_app_logs')
redis_handler.setLevel(logging.INFO)
app.logger.addHandler(redis_handler)
```
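
Because Gunicorn forks a separate process per worker, each worker needs its own handler and Redis connection. One way to wire this up is Gunicorn's `post_fork` server hook in `gunicorn.conf.py`; this is a sketch, and `myapp.logging_utils` is a hypothetical import path for wherever the handler ends up living:

```python
# gunicorn.conf.py (sketch) -- attach the Redis handler in each worker process.
import logging


def post_fork(server, worker):
    """Gunicorn server hook: runs inside each worker after it is forked.

    Creating the Redis connection here avoids sharing one socket
    across forked processes.
    """
    import redis
    from myapp.logging_utils import RedisLogHandler  # hypothetical path

    redis_client = redis.Redis(host='localhost', port=6379, db=0)
    handler = RedisLogHandler(redis_client, key='flask_app_logs')
    handler.setLevel(logging.INFO)
    # Attach to the root logger so logs from every module are captured
    logging.getLogger().addHandler(handler)
```

Attaching to the root logger here, rather than only `app.logger`, is what consolidates output from all workers into the single Redis key.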

#### 3. API Endpoints for Log Retrieval
```python
import json

from flask import jsonify, request

@app.route('/api/logs')
def get_logs():
    # Newest entries first; default 50, e.g. /api/logs?count=100
    count = request.args.get('count', 50, type=int)
    logs = redis_client.lrange('flask_app_logs', 0, count - 1)
    return jsonify([json.loads(log) for log in logs])

@app.route('/api/logs/levels/<level>')
def get_logs_by_level(level):
    # Scans the whole list; acceptable for small retention windows
    all_logs = redis_client.lrange('flask_app_logs', 0, -1)
    filtered_logs = [
        log_data for log_data in map(json.loads, all_logs)
        if log_data['level'] == level.upper()
    ]
    return jsonify(filtered_logs)
```
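
For Task 5 / AC 4, the per-emit `LTRIM` in the handler caps the list by count; a periodic job could additionally drop entries by age. A sketch, assuming timestamps are the ISO-8601 strings the handler writes (the key name and limits are illustrative defaults, not fixed requirements):

```python
import json
import time
from datetime import datetime

def enforce_log_retention(redis_client, key='flask_app_logs',
                          max_logs=1000, max_age_seconds=86400):
    """Cap the log list by count and by age.

    Complements the per-emit LTRIM in the handler; intended to run from a
    periodic job (cron, APScheduler, a Celery beat task, ...).
    """
    # Hard cap on list length (newest entries live at index 0)
    redis_client.ltrim(key, 0, max_logs - 1)

    # Drop entries older than max_age_seconds from the tail (oldest end)
    cutoff = time.time() - max_age_seconds
    while True:
        oldest = redis_client.lindex(key, -1)
        if oldest is None:
            break
        entry = json.loads(oldest)
        # Timestamps are stored as ISO-8601 strings by the handler
        if datetime.fromisoformat(entry['timestamp']).timestamp() >= cutoff:
            break
        redis_client.rpop(key)
```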

### Dependencies
- `redis` - Python Redis client
- `flask` - For API endpoints

### Testing Strategy
- Unit test the Redis logging handler
- Integration test log storage and retrieval
- Test with multiple Gunicorn workers to ensure logs are centralized
- Test log retention policies
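
The unit-test bullet above can be sketched with a minimal in-memory stand-in for the two Redis calls the handler makes, so no Redis server is needed. The handler is inlined here only so the sketch stands alone; in practice it would be imported from the application:

```python
import json
import logging
from datetime import datetime, timezone

class FakeRedis:
    """In-memory stand-in for the two Redis calls the handler uses."""
    def __init__(self):
        self.lists = {}

    def lpush(self, key, value):
        self.lists.setdefault(key, []).insert(0, value)

    def ltrim(self, key, start, stop):
        self.lists[key] = self.lists.get(key, [])[start:stop + 1]

class RedisLogHandler(logging.Handler):
    """Inline (abridged) copy of the handler sketch so this test stands alone."""
    def __init__(self, redis_client, key='app_logs', max_logs=1000):
        super().__init__()
        self.redis_client = redis_client
        self.key = key
        self.max_logs = max_logs

    def emit(self, record):
        try:
            log_entry = {
                'timestamp': datetime.now(timezone.utc).isoformat(),
                'level': record.levelname,
                'message': record.getMessage(),
                'logger': record.name,
            }
            self.redis_client.lpush(self.key, json.dumps(log_entry))
            self.redis_client.ltrim(self.key, 0, self.max_logs - 1)
        except Exception:
            self.handleError(record)

def test_handler_stores_structured_entry():
    fake = FakeRedis()
    handler = RedisLogHandler(fake, key='test_logs', max_logs=10)
    logger = logging.getLogger('redis-handler-test')
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)

    logger.info('hello from test')

    entry = json.loads(fake.lists['test_logs'][0])
    assert entry['message'] == 'hello from test'
    assert entry['level'] == 'INFO'
```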

### Notes
- Consider using Redis streams instead of lists for more advanced log querying
- Implement log rotation to prevent Redis memory issues
- Add authentication/authorization for log access endpoints
- Consider using a separate Redis database for logs
- Monitor Redis memory usage with logging enabled
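
If the Redis-streams idea from the first note is pursued, a minimal sketch using redis-py's `xadd`/`xrevrange` with approximate `MAXLEN` trimming could look like this (stream and field names are illustrative):

```python
import json

def log_to_stream(redis_client, entry, stream='app_log_stream', max_len=10000):
    """Append a structured log entry to a Redis stream, capped at ~max_len.

    approximate=True trims with MAXLEN ~, which is cheaper than exact trimming.
    """
    redis_client.xadd(stream, {'data': json.dumps(entry)},
                      maxlen=max_len, approximate=True)

def recent_stream_logs(redis_client, stream='app_log_stream', count=50):
    """Return the newest `count` entries, newest first."""
    return [json.loads(fields[b'data'])
            for _, fields in redis_client.xrevrange(stream, count=count)]
```

Streams also give every entry a monotonic ID and support consumer groups, which would make time-range queries and live tailing easier than list scans.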