---
title: Enflow Api
emoji: 📉
colorFrom: blue
colorTo: gray
sdk: docker
pinned: false
---
Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
# Enflow Backend
This is the backend for the Enflow application, which lets law enforcement agencies build, maintain, and use custom workflows that automate tasks based on officers' daily logs.
## Deployment Status
The backend is currently deployed and running at:
- **API Endpoint**: [https://huggingface.co/spaces/droov/enflow-api](https://huggingface.co/spaces/droov/enflow-api)
## Key Features
- **Intelligent Document Processing**: Extract text from PDF logs using OCR and analyze with GPT-4o-mini
- **Activity Classification**: Automatically categorize activities into defined workflows
- **Markdown Template System**: Create and fill form templates using extracted data
- **Department Management**: Organize users by departments with hierarchical access control
- **PDF Generation**: Convert filled markdown templates to professionally formatted PDFs
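As a rough illustration of the template system (the function and placeholder syntax here are hypothetical, not the actual Enflow API), filling a markdown form template with extracted data might look like this:

```python
import re

def fill_template(template: str, data: dict) -> str:
    """Replace {{field}} placeholders in a markdown template with extracted values.

    Placeholders with no matching value are left blank rather than raising,
    so partially extracted logs still produce a reviewable form.
    """
    return re.sub(r"\{\{(\w+)\}\}", lambda m: str(data.get(m.group(1), "")), template)

template = "# Incident Report\n\n- Officer: {{officer_name}}\n- Date: {{date}}\n"
print(fill_template(template, {"officer_name": "J. Doe", "date": "2024-01-15"}))
```

The filled markdown can then be handed to the PDF generation step described below.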
## Environment Setup
The backend requires several environment variables to function properly. These can be set up in a `.env` file in the root of the backend directory.
### Environment Variables
Copy the `env.example` file to `.env` and fill in the required values:
```bash
cp env.example .env
```
Then edit the `.env` file with your actual credentials:
- `MONGO_URI`: MongoDB connection string
- `JWT_SECRET`: Secret key for JWT token generation
- `OPENAI_API_KEY`: OpenAI API key for LLM processing (required)
- `REDIS_HOST`: Hostname or IP address of your Redis server
- `REDIS_PORT`: Redis port (default: 6379)
- `REDIS_PASSWORD`: Password for Redis authentication
- `FLASK_ENV`: Set to "development" or "production"
Alternatively, you can set `REDIS_URL` directly as:
```
REDIS_URL=redis://:{password}@{host}:{port}/0
```
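If you set the individual variables instead, the equivalent `REDIS_URL` can be assembled from them; a sketch (adjust the trailing database index if you use something other than `/0`):

```python
import os

def redis_url_from_env() -> str:
    """Build a redis:// URL from REDIS_HOST, REDIS_PORT, and REDIS_PASSWORD."""
    host = os.environ.get("REDIS_HOST", "localhost")
    port = os.environ.get("REDIS_PORT", "6379")
    password = os.environ.get("REDIS_PASSWORD", "")
    auth = f":{password}@" if password else ""  # omit auth part when no password is set
    return f"redis://{auth}{host}:{port}/0"

print(redis_url_from_env())
```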
### Important Security Notes
- **Never commit the `.env` file to version control**
- Do not expose these credentials in client-side code
- For production, use environment variables provided by your hosting platform rather than a `.env` file
## Running the Application
After setting up the environment variables, you can run the application:
```bash
# Install dependencies
pip install -r requirements.txt
# Run the Flask application
python app.py
# In a separate terminal, run Celery worker for background tasks
celery -A utils.celery_tasks.celery_app worker --loglevel=info
```
## Document Processing Pipeline
The application processes documents in the following steps:
1. **PDF Text Extraction**: OCR processes PDF files to extract text
2. **Activity Extraction**: LLM analyzes log text to identify individual activities
3. **Workflow Classification**: Activities are matched to appropriate workflows
4. **Data Extraction**: Required fields are extracted from activities based on workflow requirements
5. **Form Generation**: Markdown templates are filled with extracted data
6. **PDF Generation**: Filled markdown is rendered as HTML and converted to PDF
This pipeline can run either synchronously or asynchronously using Celery tasks.
## API Documentation
The API endpoints are organized by resource type:
- `/api/auth`: Authentication endpoints
- `/api/departments`: Department management
- `/api/workflows`: Workflow management
- `/api/logs`: Log upload and management
- `/api/incidents`: Incident management
For detailed API documentation, see the `API_DOCUMENTATION.md` file.
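A typical client flow is to log in and then send the JWT on subsequent requests. The endpoint paths come from the list above, but the request and response field names below are assumptions; check `API_DOCUMENTATION.md` for the actual schema:

```python
import json
import urllib.request

BASE = "http://localhost:5000"

def auth_headers(token: str) -> dict:
    """Standard Bearer-token headers for authenticated endpoints."""
    return {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

def login(email: str, password: str) -> str:
    """POST credentials to /api/auth/login and return the JWT (field names assumed)."""
    body = json.dumps({"email": email, "password": password}).encode()
    req = urllib.request.Request(f"{BASE}/api/auth/login", data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["token"]

# Example (requires a running server):
#   token = login("officer@example.com", "password")
#   then pass auth_headers(token) on requests to /api/logs, /api/workflows, etc.
```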
## Setup Instructions
### Prerequisites
- Python 3.10 or newer
- Docker and Docker Compose (optional, for containerized deployment)
- MongoDB (we're using MongoDB Atlas in the current setup)
- Redis server (for Celery task queue)
### Environment Setup
1. Clone the repository
2. Run the setup script to create the `.env` file with the required environment variables:
```bash
python setup_env.py
```
Or manually create a `.env` file in the backend directory with the following variables:
```
MONGO_URI=your_mongodb_connection_string
JWT_SECRET=your_jwt_secret
OPENAI_API_KEY=your_openai_api_key
REDIS_HOST=your_redis_host
REDIS_PORT=your_redis_port
REDIS_PASSWORD=your_redis_password
FLASK_ENV=development
```
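The application typically loads this file with `python-dotenv`. If you want to see what that amounts to, a minimal stdlib-only parser for the format above looks like this (a sketch: real `.env` files also support quoting and inline comments, which this ignores):

```python
import os

def load_env_file(path: str) -> None:
    """Parse KEY=value lines from a .env-style file into os.environ.

    Skips blank lines and comments; existing environment variables win,
    matching the usual dotenv behavior of not overriding the real environment.
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```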
### Local Development
1. Create and activate a virtual environment:
```bash
python -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
```
2. Install dependencies:
```bash
pip install -r requirements.txt
```
3. Run the application:
```bash
python app.py
```
4. The API will be available at http://localhost:5000
### Docker Deployment
1. Build and start the containers:
```bash
docker-compose up -d
```
2. The API will be available at http://localhost:5000
### HuggingFace Deployment
The backend is already deployed to HuggingFace Spaces at: [https://huggingface.co/spaces/droov/enflow-api](https://huggingface.co/spaces/droov/enflow-api)
For detailed deployment instructions, see the `hugginface_setup.md` file.
Key points for HuggingFace deployment:
1. Set your Space to use the Docker SDK
2. Add all environment variables in the Space settings
3. Make sure the Redis configuration is properly set up
4. The API will be available at your HuggingFace Space URL
## Project Structure
- `app.py` - Main Flask application
- `db.py` - Database connection and utilities
- `models/` - Data models
- `controllers/` - Controller functions
- `routes/` - API route definitions
- `utils/` - Utility functions and middleware
- `celery_tasks.py` - Celery task definitions
- `pdf_utils.py` - PDF processing and NLP utilities
- `celery_config.py` - Celery configuration for task management
- `setup_env.py` - Script to set up environment variables
- `test_department.py` - Test script for department creation
- `test_auth.py` - Test script for authentication
- `test_hf_deployment.py` - Test script for HuggingFace deployment
- `Dockerfile` - Docker configuration for containerized deployment
- `docker-compose.yml` - Docker Compose configuration for local development
- `API_DOCUMENTATION.md` - Detailed API documentation
- `hugginface_setup.md` - Guide for deploying to HuggingFace Spaces