Fix Docker security: Remove .env from image
Security improvements:
- Removed .env copy from Dockerfile (credentials should never be in images)
- Added .env to .dockerignore
- Created .env.example template for users
- Environment variables now passed via docker-compose or --env-file
Docker build will now succeed and be secure for production deployment.
🤖 Generated with Claude Code
Co-Authored-By: Claude <noreply@anthropic.com>
- .dockerignore +4 -0
- .env.example +39 -0
- Dockerfile +0 -1
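The last bullet can be made concrete with a compose file; a minimal sketch, where the service name `app` and image tag `rag-app:latest` are assumptions, not taken from the repo:

```yaml
# docker-compose.yml — env_file loads KEY=value pairs from the local .env
# at container start, so credentials never enter the image layers.
services:
  app:
    image: rag-app:latest
    build: .
    env_file: .env        # equivalent of `docker run --env-file .env`
    ports:
      - "8000:8000"       # matches API_PORT in .env.example
```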
.dockerignore
CHANGED

@@ -24,6 +24,10 @@ ENV/
 .git/
 .gitignore
 
+# Environment
+.env
+.env.*
+
 # Testing
 .pytest_cache/
 .coverage
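The two new patterns cover both the credentials file and any suffixed variant. A rough Python illustration of the matching (an approximation: Docker's actual matcher uses Go's `filepath.Match`, not Python's `fnmatch`):

```python
from fnmatch import fnmatch

# Illustrative analogue of the new .dockerignore rules — not Docker's
# real matcher, which follows Go filepath.Match semantics.
PATTERNS = [".env", ".env.*"]

def is_ignored(name: str) -> bool:
    """Return True if any ignore pattern matches the file name."""
    return any(fnmatch(name, p) for p in PATTERNS)

assert is_ignored(".env")          # the credentials file itself
assert is_ignored(".env.local")    # suffixed variants via .env.*
assert is_ignored(".env.example")  # the template is excluded from the build context too
assert not is_ignored("run.py")    # application code still enters the context
```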
.env.example
ADDED

@@ -0,0 +1,39 @@
+# Azure OpenAI Configuration
+AZURE_OPENAI_API_KEY=your_azure_openai_api_key_here
+AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
+AZURE_OPENAI_API_VERSION=2024-08-01-preview
+
+# Azure Document Intelligence (using same credentials as OpenAI for hackathon)
+AZURE_DOCUMENT_INTELLIGENCE_ENDPOINT=https://your-resource.services.ai.azure.com/
+AZURE_DOCUMENT_INTELLIGENCE_KEY=your_document_intelligence_key_here
+
+# VM Configuration (Optional)
+VM_HOST=your-vm-host.cloudapp.azure.com
+VM_USER=hackathon
+VM_SSH_KEY=your_ssh_key
+
+# HuggingFace Resources (Optional)
+HUGGINGFACE_ORG=https://huggingface.co/SOCARAI
+DATASET_NAME=SOCARAI/ai_track_data
+
+# GitHub Code Samples (Optional)
+CODE_SAMPLES_REPO=https://github.com/neaorin/foundry-models-samples
+
+# Application Configuration
+DATA_DIR=./data
+PDF_DIR=./data/pdfs
+VECTOR_DB_PATH=./data/vector_db
+PROCESSED_DIR=./data/processed
+
+# LLM Model Configuration
+# Available open-source models: DeepSeek-R1, Llama-4-Maverick-17B-128E-Instruct-FP8
+# Using Llama-4-Maverick for optimal speed/quality balance and open-source architecture scores!
+LLM_MODEL=Llama-4-Maverick-17B-128E-Instruct-FP8
+
+# API Configuration
+API_HOST=0.0.0.0
+API_PORT=8000
+
+# Disable telemetry and warnings
+TOKENIZERS_PARALLELISM=false
+ANONYMIZED_TELEMETRY=false
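Since the image no longer carries a baked-in .env, the application has to read these keys from the process environment at startup. A minimal sketch of such a loader (the helper name, the choice of which keys are required, and the returned dict shape are assumptions for illustration, not taken from the repo):

```python
import os

def load_config() -> dict:
    """Read settings from the environment, mirroring .env.example defaults.

    Fails fast if a credential with no sensible default is missing, so a
    container started without --env-file/env_file errors at boot rather
    than mid-request.
    """
    required = ["AZURE_OPENAI_API_KEY", "AZURE_OPENAI_ENDPOINT"]
    missing = [k for k in required if not os.getenv(k)]
    if missing:
        raise RuntimeError(f"Missing required environment variables: {missing}")
    return {
        "api_key": os.environ["AZURE_OPENAI_API_KEY"],
        "endpoint": os.environ["AZURE_OPENAI_ENDPOINT"],
        "api_version": os.getenv("AZURE_OPENAI_API_VERSION", "2024-08-01-preview"),
        "data_dir": os.getenv("DATA_DIR", "./data"),
        "api_port": int(os.getenv("API_PORT", "8000")),
    }
```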
Dockerfile
CHANGED

@@ -35,7 +35,6 @@ COPY --from=builder /usr/local/bin /usr/local/bin
 # Copy application code
 COPY src/ ./src/
 COPY run.py .
-COPY .env .
 
 # Create directories for data
 RUN mkdir -p data/pdfs data/vector_db data/processed