
Tech-Spec 2: Remote Application Logging with Redis

Created: 2025-12-22 Status: Ready for Development

Overview

Problem Statement

Application logs need to be centralized in Redis to enable remote access and monitoring of application behavior across multiple Gunicorn workers. This will help with debugging issues like the background task tracking problem and provide better visibility into application performance.

Solution

Implement a logging system that sends application logs to Redis, enabling remote access, centralized monitoring, and real-time log streaming.

Scope (In/Out)

In Scope:

  • Implement Redis-based logging handler
  • Configure log formatting for Redis storage
  • Create log retrieval endpoints
  • Implement log streaming capabilities
  • Add log retention and cleanup policies

Out of Scope:

  • Replacing existing logging infrastructure completely
  • Implementing complex log analytics
  • Creating a full dashboard UI

Context for Development

Codebase Patterns

  • Flask application with existing logging infrastructure
  • Multiple Gunicorn workers requiring centralized logging
  • Need for remote log access and debugging

Files to Reference

  • app.py - Main Flask application where logging is configured
  • gunicorn.conf.py - Gunicorn configuration
  • Any existing logging configuration files

Technical Decisions

  • Use Redis lists for log storage with push operations
  • Implement log level filtering and retention
  • Use structured logging format for easier parsing
  • Add log rotation to prevent Redis memory issues
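As a sketch of the structured-format decision above, each entry pushed to the Redis list could be a small JSON document. The field values here are illustrative, not taken from the spec:

```python
import json
from datetime import datetime, timezone

# Illustrative structured entry matching the "timestamp, level,
# module, message" decision; the values are made up for the example.
entry = {
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "level": "INFO",
    "module": "tasks",
    "message": "background job enqueued",
}
serialized = json.dumps(entry)  # this string is what LPUSH would store
```

Storing one JSON string per list element keeps parsing trivial on retrieval and lets LTRIM enforce retention by element count.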

Implementation Plan

Tasks

  • Task 1: Set up Redis connection for logging
  • Task 2: Create custom Python logging handler for Redis
  • Task 3: Configure Flask application to use Redis logger
  • Task 4: Create API endpoints to retrieve logs from Redis
  • Task 5: Implement log retention and cleanup policies
  • Task 6: Test logging functionality with multiple Gunicorn workers

Acceptance Criteria

  • AC 1: Application logs are stored in Redis with proper formatting
  • AC 2: Remote access to logs via API endpoints
  • AC 3: Logs from all Gunicorn workers are consolidated in Redis
  • AC 4: Log retention prevents Redis memory overflow
  • AC 5: Structured logs include timestamp, level, module, and message

Additional Context

Implementation Details

1. Redis Logging Handler

import redis
import json
import logging
from datetime import datetime, timezone

class RedisLogHandler(logging.Handler):
    def __init__(self, redis_client, key='app_logs', max_logs=1000):
        super().__init__()
        self.redis_client = redis_client
        self.key = key
        self.max_logs = max_logs

    def emit(self, record):
        try:
            log_entry = {
                'timestamp': datetime.now(timezone.utc).isoformat(),
                'level': record.levelname,
                'module': record.module,
                'function': record.funcName,
                'line': record.lineno,
                'message': record.getMessage(),
                'logger': record.name
            }
            # Add exception info if present
            if record.exc_info:
                log_entry['exception'] = self.format(record)
            
            # Push to Redis list
            self.redis_client.lpush(self.key, json.dumps(log_entry))
            # Trim to max_logs to prevent memory issues
            self.redis_client.ltrim(self.key, 0, self.max_logs - 1)
        except Exception:
            self.handleError(record)

2. Flask Integration

import logging

import redis
from flask import Flask

# Initialize Redis connection
redis_client = redis.Redis(host='localhost', port=6379, db=0)

# Configure logging
app = Flask(__name__)
redis_handler = RedisLogHandler(redis_client, key='flask_app_logs')
redis_handler.setLevel(logging.INFO)

# Add to Flask app logger
app.logger.addHandler(redis_handler)

3. API Endpoints for Log Retrieval

import json

from flask import jsonify, request

@app.route('/api/logs')
def get_logs():
    count = request.args.get('count', 50, type=int)
    logs = redis_client.lrange('flask_app_logs', 0, count - 1)
    return jsonify([json.loads(log) for log in logs])

@app.route('/api/logs/levels/<level>')
def get_logs_by_level(level):
    all_logs = redis_client.lrange('flask_app_logs', 0, -1)
    filtered_logs = []
    for log in all_logs:
        log_data = json.loads(log)
        if log_data['level'] == level.upper():
            filtered_logs.append(log_data)
    return jsonify(filtered_logs)

Dependencies

  • redis - Python Redis client
  • flask - For API endpoints

Testing Strategy

  • Unit test the Redis logging handler
  • Integration test log storage and retrieval
  • Test with multiple Gunicorn workers to ensure logs are centralized
  • Test log retention policies
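The handler unit test can avoid a live Redis instance by substituting a minimal in-memory stub for the client. The sketch below reproduces a trimmed copy of the handler so it is self-contained; `StubRedis` is a test double invented for the example:

```python
import json
import logging
from datetime import datetime, timezone


class RedisLogHandler(logging.Handler):
    # Trimmed copy of the handler from the implementation section,
    # reproduced here so the test stands alone.
    def __init__(self, redis_client, key='app_logs', max_logs=1000):
        super().__init__()
        self.redis_client = redis_client
        self.key = key
        self.max_logs = max_logs

    def emit(self, record):
        try:
            log_entry = {
                'timestamp': datetime.now(timezone.utc).isoformat(),
                'level': record.levelname,
                'message': record.getMessage(),
            }
            self.redis_client.lpush(self.key, json.dumps(log_entry))
            self.redis_client.ltrim(self.key, 0, self.max_logs - 1)
        except Exception:
            self.handleError(record)


class StubRedis:
    """In-memory stand-in exposing only the two calls the handler makes."""

    def __init__(self):
        self.lists = {}

    def lpush(self, key, value):
        self.lists.setdefault(key, []).insert(0, value)

    def ltrim(self, key, start, end):
        self.lists[key] = self.lists.get(key, [])[start:end + 1]


stub = StubRedis()
logger = logging.getLogger('test_redis_handler')
logger.setLevel(logging.INFO)
logger.addHandler(RedisLogHandler(stub, key='app_logs', max_logs=2))

logger.info('first')
logger.info('second')
logger.info('third')  # max_logs=2, so LTRIM should drop 'first'

entries = [json.loads(e) for e in stub.lists['app_logs']]
```

The same stub can back the retention test: pushing one more entry than `max_logs` should leave exactly `max_logs` entries, newest first.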

Notes

  • Consider using Redis streams instead of lists for more advanced log querying
  • Implement log rotation to prevent Redis memory issues
  • Add authentication/authorization for log access endpoints
  • Consider using a separate Redis database for logs
  • Monitor Redis memory usage with logging enabled
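The Redis Streams note above could take the shape of a handler variant that replaces LPUSH/LTRIM with a single XADD capped by `maxlen`. The stream key and the approximate-trim setting here are illustrative choices, not decisions from this spec:

```python
import logging
from datetime import datetime, timezone


class RedisStreamLogHandler(logging.Handler):
    """Variant of RedisLogHandler that writes to a Redis Stream, so
    entries get server-assigned IDs and can be range-queried (XRANGE)
    or consumed by groups instead of read with LRANGE."""

    def __init__(self, redis_client, key='app_logs_stream', max_logs=1000):
        super().__init__()
        self.redis_client = redis_client
        self.key = key
        self.max_logs = max_logs

    def emit(self, record):
        try:
            fields = {
                'timestamp': datetime.now(timezone.utc).isoformat(),
                'level': record.levelname,
                'module': record.module,
                'message': record.getMessage(),
            }
            # approximate=True lets Redis trim lazily on node boundaries,
            # which is cheaper than exact trimming on every write.
            self.redis_client.xadd(
                self.key, fields, maxlen=self.max_logs, approximate=True)
        except Exception:
            self.handleError(record)
```

Streams also make the in-scope streaming feature simpler, since consumers can block on XREAD for new entries rather than polling a list.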