Upload 9 files
- README.md +130 -11
- app.js +63 -0
- app.py +234 -0
- architecture-diagram.js +96 -0
- index.html +355 -19
- quantum-viz.js +327 -0
- requirements.txt +8 -0
- simulator.js +644 -0
- styles.css +803 -0
README.md
CHANGED
@@ -1,12 +1,131 @@
# Quantum-Enhanced WAN 2.1 Video Generation System

This project is a web-based, quantum-enhanced video generation interface. It leverages a hybrid approach: a JavaScript frontend for UI and quantum effect visualization, and a Python Flask backend for LLM-guided creative direction and CLIP-based image understanding. The "quantum effect" is the core diffusion model, interpreted and rendered visually on the frontend based on instructions from the AI Director (LLM).

## Features

- **Hybrid Quantum-Classical Architecture:** Frontend simulates quantum effects as the primary video generation (diffusion) mechanism.
- **AI Director (LLM-Powered):** Utilizes `chat.completions`-based models (MLC LLM for GGUF) to generate frame-by-frame visual transformation guidance based on user prompts and current frame context.
- **CLIP-Based Context:** Employs HuggingFace CLIP to understand the visual content of input and intermediate frames.
- **Dynamic Quantum Visualizations:** `quantum-viz.js` dynamically reacts to "Quantum Influence" and "Entanglement Depth" parameters.
- **Real-time Effect Rendering:** Frontend `simulator.js` interprets LLM instructions into various visual effects (color shifts, blur, pixelation, glitches, zoom, rotation, bloom, noise, chromatic aberration, ripple, scanlines, vignette).
- **Movie Recording:** Records generated frame sequences into a WebM video.
- **Local-First AI:** Prioritizes local GGUF models via MLC LLM for privacy and performance.
- **HuggingFace Spaces Compatible:** Designed for deployment on HuggingFace Spaces.

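To make the "Real-time Effect Rendering" idea concrete, here is a minimal Python sketch of how an LLM instruction string could be split into (instruction, magnitude) pairs. The `parse_instructions` helper and its grammar are hypothetical; the actual interpretation lives in `simulator.js`:

```python
import re

def parse_instructions(guidance):
    """Split a comma-separated LLM guidance string into (clause, magnitude) pairs.

    Hypothetical sketch: the first number in each clause is taken as its
    magnitude, defaulting to 1.0 for purely qualitative cues like 'ripple effect'.
    """
    effects = []
    for clause in guidance.split(","):
        clause = clause.strip()
        if not clause:
            continue
        numbers = re.findall(r"-?\d+(?:\.\d+)?", clause)
        magnitude = float(numbers[0]) if numbers else 1.0
        effects.append((clause, magnitude))
    return effects

parsed = parse_instructions("shift red by +5, apply gaussian blur radius 3, ripple effect")
print(parsed[1])  # ('apply gaussian blur radius 3', 3.0)
```

A renderer would then dispatch on each clause (e.g., substring match on "blur" or "shift red") and scale the effect by the parsed magnitude.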
## Setup Instructions

### 1. Clone the Repository

```bash
git clone https://github.com/yourusername/quantum_enhanced_wan_2_1.git
cd quantum_enhanced_wan_2_1
```

### 2. Frontend Setup (JavaScript)

The frontend is a static web application. No special installation steps are required beyond having a modern web browser.

### 3. Backend Setup (Python)

The backend is a Flask application responsible for LLM and CLIP inference.

#### 3.1. Create and Activate Virtual Environment

It's highly recommended to use a Python virtual environment:

```bash
python -m venv venv
# On Windows:
.\venv\Scripts\activate
# On macOS/Linux:
source venv/bin/activate
```

#### 3.2. Install Python Dependencies

Navigate to the `backend` directory and install the required packages:

```bash
cd backend
pip install -r requirements.txt
cd ..  # Go back to the root directory
```

#### 3.3. Download and Compile MLC LLM Model

The backend uses MLC LLM for local inference with GGUF models. You need to download and compile a compatible model.

1. **Choose a Model:** Browse the Hugging Face Hub for GGUF models. Popular choices include `Llama-2`, `Mistral`, `Gemma`, etc. Ensure it's a chat-tuned model for better instruction following.
   * Example model: `Llama-2-7b-chat-hf-q4f16_1` (from MLC LLM's own examples)

2. **Download and Compile:** Use the `mlc_llm` command-line tool. You need to specify a `--model-path` where the compiled model artifacts will be stored. The Flask app is configured to look in `./backend/model_artifacts`.

   ```bash
   # Ensure mlc-llm is installed: pip install mlc-llm-nightly -f https://mlc.ai/wheels

   # Create the directory for model artifacts if it doesn't exist
   mkdir -p backend/model_artifacts

   # Download and compile your chosen model
   # Replace 'Llama-2-7b-chat-hf-q4f16_1' with your chosen model name if different
   mlc_llm chat Llama-2-7b-chat-hf-q4f16_1 --model-path backend/model_artifacts
   ```

   This command will download the model weights and compile them for your system. This process can take a significant amount of time and disk space (several GB, depending on the model). The compiled model will be placed in a subdirectory (e.g., `backend/model_artifacts/Llama-2-7b-chat-hf-q4f16_1`).

3. **Verify Configuration in `backend/app.py`:**
   Ensure that `MLC_MODEL_NAME` and `MLC_MODEL_ARTIFACTS_DIR` (or directly `MLC_MODEL_PATH`) in `backend/app.py` match your downloaded model's name and path. By default, they are set to:

   ```python
   MLC_MODEL_ARTIFACTS_DIR = os.getenv("MLC_MODEL_ARTIFACTS_DIR", "./backend/model_artifacts")
   MLC_MODEL_NAME = os.getenv("MLC_MODEL_NAME", "Llama-2-7b-chat-hf-q4f16_1")
   ```

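With those defaults, the backend resolves the final model path with a plain `os.path.join`; a quick sketch of the same logic:

```python
import os

# Defaults mirrored from backend/app.py; override via environment variables.
artifacts_dir = os.getenv("MLC_MODEL_ARTIFACTS_DIR", "./backend/model_artifacts")
model_name = os.getenv("MLC_MODEL_NAME", "MLC_MODEL_NAME" and "Llama-2-7b-chat-hf-q4f16_1")

# The path the ChatModule is loaded from.
model_path = os.path.join(artifacts_dir, model_name)
print(model_path)  # with the defaults: ./backend/model_artifacts/Llama-2-7b-chat-hf-q4f16_1
```

If loading fails with "model path not found", printing `model_path` like this is a quick way to check what the server is actually looking for.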
### 4. Running the Application Locally

#### 4.1. Start the Backend Server

Open a terminal, activate your virtual environment, navigate to the root directory of the project, and run the Flask app:

```bash
# On Windows:
.\venv\Scripts\activate
python backend/app.py
# On macOS/Linux:
source venv/bin/activate
python backend/app.py
```

The backend should start on `http://localhost:5000`. You will see messages in the terminal indicating whether the LLM and CLIP models loaded successfully. *Note: model loading can take some time.*

#### 4.2. Open the Frontend

Open the `index.html` file in your web browser. You can typically do this by dragging the file into your browser, or by using a simple local web server (e.g., Python's `http.server` module):

```bash
# In a new terminal, from the root directory:
python -m http.server 8000
# Then open your browser to http://localhost:8000
```

## Usage

1. **Upload a Source Image:** Use the "UPLOAD SOURCE IMAGE" drop zone to provide an initial image.
2. **Enter a Prompt:** Describe the desired visual transformation or movie concept in the "Prompt Context" textarea.
3. **Adjust Quantum Parameters:** Use the "Quantum Influence" and "Entanglement Depth" sliders to control the intensity and complexity of the quantum diffusion effects.
4. **Toggle Director Mode:** Enable "DIRECTOR MODE" to record the generated frames into a movie.
5. **Initialize Generation:** Click "INITIALIZE GENERATION" to start the process. The frontend will communicate with the backend for LLM guidance, and then render frames locally using the simulated quantum effects.
6. **Download Movie:** If in Director Mode, a "SAVE MOVIE" button will appear once frames are generated, allowing you to download the WebM video.

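Behind step 5, the frontend POSTs the current frame and slider settings to the backend's `/generate_frame_guidance` endpoint. A minimal Python sketch of the payload shape (field names taken from `backend/app.py`; the frame bytes below are illustrative stand-ins, not a real PNG):

```python
import base64

# Stand-in for a canvas capture of the current frame.
frame_bytes = b"\x89PNG-illustrative-frame-bytes"
image_data_url = "data:image/png;base64," + base64.b64encode(frame_bytes).decode()

payload = {
    "image": image_data_url,  # current frame as a data URL
    "prompt": "slow dolly zoom through a nebula",
    "influence": 40,          # Quantum Influence slider (0-100)
    "depth": 16,              # Entanglement Depth slider
    "frame_number": 0,
}

# The backend strips the data-URL header exactly like this:
header, encoded = payload["image"].split(",", 1)
assert base64.b64decode(encoded) == frame_bytes
print(header)  # data:image/png;base64
```

The response carries a `guidance` string (the LLM's transformation instructions) and a `log` string for the UI console.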
## Deployment to Hugging Face Spaces

To deploy this application to Hugging Face Spaces, you will need a `Dockerfile` for the backend and potentially some configuration for the frontend.

### 1. Backend Dockerfile (coming soon)

A `Dockerfile` will be provided in the `backend` directory to containerize the Flask application and its dependencies, including the MLC LLM and CLIP models.

### 2. Hugging Face Spaces `app.json` (or similar)

Configuration files will guide Hugging Face Spaces on how to build and run your application.

---

**Disclaimer:** This project simulates quantum-enhanced video generation. While it integrates real AI models (LLM, CLIP) for creative guidance, the "quantum effects" and "diffusion" are visual approximations implemented in JavaScript for demonstration purposes; true real-time quantum image generation is an active research area and not currently feasible in a browser environment.

app.js
ADDED
@@ -0,0 +1,63 @@

// Main application controller

class App {
    constructor() {
        this.currentSection = 'overview';
        this.init();
    }

    init() {
        this.setupNavigation();
        this.updateSliderValues();
    }

    setupNavigation() {
        const navLinks = document.querySelectorAll('.nav-link');

        navLinks.forEach(link => {
            link.addEventListener('click', (e) => {
                e.preventDefault();
                const section = link.dataset.section;
                this.navigateTo(section);
            });
        });
    }

    navigateTo(section) {
        // Update nav links
        document.querySelectorAll('.nav-link').forEach(link => {
            link.classList.remove('active');
            if (link.dataset.section === section) {
                link.classList.add('active');
            }
        });

        // Update sections
        document.querySelectorAll('.section').forEach(sec => {
            sec.classList.remove('active');
        });
        document.getElementById(section).classList.add('active');

        this.currentSection = section;
    }

    updateSliderValues() {
        const quantumInfluence = document.getElementById('quantum-influence');
        const entanglementDepth = document.getElementById('entanglement-depth');

        if (quantumInfluence) {
            quantumInfluence.addEventListener('input', (e) => {
                document.getElementById('quantum-influence-val').textContent = e.target.value;
            });
        }

        if (entanglementDepth) {
            entanglementDepth.addEventListener('input', (e) => {
                document.getElementById('entanglement-val').textContent = e.target.value;
            });
        }
    }
}

// Initialize app
const app = new App();
app.py
ADDED
@@ -0,0 +1,234 @@

from flask import Flask, request, jsonify
from flask_cors import CORS
import os
import base64
from PIL import Image
import io
import torch
from transformers import CLIPProcessor, CLIPModel
from mlc_llm import ChatModule
import threading
import numpy as np
import hashlib
import time

app = Flask(__name__)
CORS(app)  # Enable CORS for all routes

# --- Configuration for MLC LLM ---
# IMPORTANT: Before running, ensure you have downloaded and compiled an MLC LLM model.
# Example:
# 1. Install MLC LLM: pip install mlc-llm-nightly -f https://mlc.ai/wheels
# 2. Download a model: mlc_llm chat Llama-2-7b-chat-hf-q4f16_1 --model-path ./model_artifacts
#    (This will create a `dist` folder inside `model_artifacts` with the model.)
# 3. Update MLC_MODEL_PATH and MLC_MODEL_NAME below to match your downloaded model.
#    E.g., if you run `mlc_llm chat Llama-2-7b-chat-hf-q4f16_1`, it will create a folder
#    like `Llama-2-7b-chat-hf-q4f16_1` within your specified --model-path.
MLC_MODEL_ARTIFACTS_DIR = os.getenv("MLC_MODEL_ARTIFACTS_DIR", "./backend/model_artifacts")
MLC_MODEL_NAME = os.getenv("MLC_MODEL_NAME", "Llama-2-7b-chat-hf-q4f16_1")
MLC_MODEL_PATH = os.path.join(MLC_MODEL_ARTIFACTS_DIR, MLC_MODEL_NAME)

# --- Configuration for HuggingFace CLIP ---
CLIP_MODEL_NAME = "openai/clip-vit-base-patch32"

# Global instances for models
clip_processor = None
clip_model = None
mlc_chat_module = None
mlc_lock = threading.Lock()  # To ensure thread-safe access to the LLM

def load_mlc_llm_model():
    global mlc_chat_module
    if mlc_chat_module is None:
        print(f"Attempting to load LLM model: {MLC_MODEL_NAME} from {MLC_MODEL_PATH}...")
        try:
            if not os.path.exists(MLC_MODEL_PATH):
                print(f"Error: MLC LLM model path not found: {MLC_MODEL_PATH}")
                print("Please ensure the MLC LLM model is downloaded and compiled in the specified path.")
                print("Refer to installation instructions for mlc-llm and model download commands.")
                return None

            mlc_chat_module = ChatModule(model=MLC_MODEL_NAME, model_path=MLC_MODEL_PATH)
            print("MLC LLM model loaded successfully.")
        except Exception as e:
            print(f"Error loading MLC LLM model: {e}")
            mlc_chat_module = None
    return mlc_chat_module

def load_clip_model():
    global clip_processor, clip_model
    if clip_model is None:
        print(f"Attempting to load CLIP model: {CLIP_MODEL_NAME}...")
        try:
            clip_processor = CLIPProcessor.from_pretrained(CLIP_MODEL_NAME)
            clip_model = CLIPModel.from_pretrained(CLIP_MODEL_NAME)
            # Move model to GPU if available for faster inference
            if torch.cuda.is_available():
                clip_model.to("cuda")
                print("CLIP model moved to CUDA.")
            print("CLIP model loaded successfully.")
        except Exception as e:
            print(f"Error loading CLIP model: {e}")
            clip_processor, clip_model = None, None
    return clip_processor, clip_model

# Load models on app startup
with app.app_context():
    if mlc_chat_module is None:
        load_mlc_llm_model()
    if clip_model is None:
        load_clip_model()

@app.route('/')
def health_check():
    llm_status = "loaded" if mlc_chat_module else "not loaded (check logs)"
    clip_status = "loaded" if clip_model else "not loaded (check logs)"
    return jsonify({
        "status": "Quantum-Enhanced WAN 2.1 Backend is running!",
        "llm_status": llm_status,
        "clip_status": clip_status
    })

@app.route('/embed_image', methods=['POST'])
def embed_image():
    if clip_processor is None or clip_model is None:
        return jsonify({"error": "CLIP model not loaded. Check server logs for details."}), 500

    data = request.get_json()
    image_data_url = data.get('image')

    if not image_data_url:
        return jsonify({"error": "No image data provided"}), 400

    try:
        header, encoded = image_data_url.split(",", 1)
        image_bytes = base64.b64decode(encoded)
        image = Image.open(io.BytesIO(image_bytes)).convert("RGB")

        inputs = clip_processor(images=image, return_tensors="pt")
        if torch.cuda.is_available():
            inputs = {k: v.to("cuda") for k, v in inputs.items()}

        with torch.no_grad():
            image_features = clip_model.get_image_features(**inputs)

        # Normalize embeddings and convert to list for JSON serialization
        image_embeddings = image_features / image_features.norm(p=2, dim=-1, keepdim=True)
        return jsonify({"embeddings": image_embeddings.squeeze().tolist()})

    except Exception as e:
        print(f"Error embedding image: {e}")
        return jsonify({"error": f"Failed to embed image: {str(e)}"}), 500

@app.route('/chat/completions', methods=['POST'])
def chat_completions_endpoint():
    if mlc_chat_module is None:
        return jsonify({"error": "LLM model not loaded. Check server logs for details."}), 500

    data = request.get_json()
    prompt = data.get("prompt")
    system_message = data.get("system_message", "You are a creative AI assistant for video generation.")

    if not prompt:
        return jsonify({"error": "Prompt is required"}), 400

    try:
        full_prompt = f"{system_message}\nUser: {prompt}"

        with mlc_lock:
            mlc_chat_module.reset_chat()
            response = mlc_chat_module.generate(full_prompt)

        return jsonify({"completion": response})
    except Exception as e:
        print(f"Error getting chat completion: {e}")
        return jsonify({"error": f"Failed to get chat completion: {str(e)}"}), 500

@app.route('/generate_frame_guidance', methods=['POST'])
def generate_frame_guidance():
    # This endpoint provides LLM guidance for the frontend's quantum diffusion.
    # It does NOT generate the image itself.
    if mlc_chat_module is None or clip_processor is None or clip_model is None:
        return jsonify({"error": "One or more AI models not loaded. Check server logs for details."}), 500

    data = request.get_json()
    image_data_url = data.get('image')  # The current frame from the frontend
    prompt = data.get('prompt', 'Quantum interpolation')
    influence = data.get('influence', 5)  # 0-100
    entanglement_depth = data.get('depth', 16)  # For LLM to consider
    frame_number = data.get('frame_number', 0)

    if not image_data_url:
        return jsonify({"error": "No image data provided"}), 400

    try:
        # 1. Get CLIP embeddings for the current frame
        header, encoded = image_data_url.split(",", 1)
        image_bytes = base64.b64decode(encoded)
        input_image = Image.open(io.BytesIO(image_bytes)).convert("RGB")

        clip_inputs = clip_processor(images=input_image, return_tensors="pt")
        if torch.cuda.is_available():
            clip_inputs = {k: v.to("cuda") for k, v in clip_inputs.items()}

        with torch.no_grad():
            image_features = clip_model.get_image_features(**clip_inputs)
        image_embeddings_np = image_features.squeeze().cpu().numpy()
        embedding_snippet = ", ".join([f"{x:.4f}" for x in image_embeddings_np[:10]])

        # 2. Use LLM to generate guidance for the next quantum diffusion step
        llm_prompt = (
            f"You are an AI video director for a quantum diffusion system. Your task is to guide the transformation "
            f"of a video frame based on quantum principles and user input. "
            f"Given the current frame's visual context (CLIP features: [{embedding_snippet}...]), "
            f"the user's creative prompt: '{prompt}', "
            f"and the quantum settings (Quantum Influence: {influence}%, Entanglement Depth: {entanglement_depth} layers), "
            f"describe *precisely* how the quantum diffusion effect should transform the current frame into frame {frame_number + 1}. "
            f"Think of these transformations as manipulating a quantum state that manifests visually. "
            f"Higher influence and depth should lead to more pronounced, chaotic, or surreal quantum effects. "
            f"Focus on quantifiable visual parameters, including: "
            f"color shifts (e.g., 'shift red by +{round(influence / 5)}', 'hue rotate {round(influence * 1.5)}deg'), "
            f"blur (e.g., 'apply gaussian blur radius {max(1, round(influence / 10))}'), "
            f"glitch/distortion (e.g., 'pixel displacement x-axis random {max(5, round(influence / 5))}px', 'chromatic aberration offset {max(1, round(influence / 20))}'), "
            f"zoom/pan (e.g., 'zoom in {1.00 + influence / 2000}x, pan right {round(influence / 10)}px'), "
            f"pattern overlay (e.g., 'overlay subtle static pattern opacity {influence / 200}'), "
            f"motion blur (e.g., 'apply motion blur strength {round(entanglement_depth / 2)}'), "
            f"bloom (e.g., 'add bloom strength {influence / 100}'), "
            f"noise (e.g., 'add noise amount {influence / 50}'), "
            f"vignette (e.g., 'add vignette strength {influence / 200}'), "
            f"or specific quantum-themed visual cues (e.g., 'ripple effect', 'add subtle scanlines opacity {influence / 200}', 'invert colors'). "
            f"Combine these to create a dynamic, quantum-like visual evolution. Ensure the intensity of effects scales with Influence and Depth. "
            f"Be concise and output only the transformation instructions. "
            f"Example: 'shift blue by +{round(influence / 5)}, apply motion blur strength {round(entanglement_depth / 2)}, zoom {1.00 + influence / 2000}x, add subtle scanlines opacity {influence / 200}'.\n"
            f"Transformation Instructions for frame {frame_number + 1}:"
        )

        llm_guidance = ""
        try:
            with mlc_lock:
                mlc_chat_module.reset_chat()
                llm_guidance = mlc_chat_module.generate(llm_prompt)
        except Exception as llm_e:
            print(f"LLM guidance generation failed: {llm_e}. Using fallback guidance.")
            llm_guidance = f"apply subtle glitch effect, shift colors slightly based on quantum influence {influence}%."

        print(f"LLM Guidance: {llm_guidance}")

        return jsonify({
            "guidance": llm_guidance,
            "log": (f"Backend provided guidance for frame {frame_number + 1} based on prompt: '{prompt[:50]}...', "
                    f"influence: {influence}, depth: {entanglement_depth}. LLM guidance: '{llm_guidance[:50]}...'.")
        })
    except Exception as e:
        print(f"Error generating frame guidance: {e}")
        return jsonify({"error": f"Failed to generate frame guidance: {str(e)}"}), 500

if __name__ == '__main__':
    if not os.path.exists(MLC_MODEL_ARTIFACTS_DIR):
        os.makedirs(MLC_MODEL_ARTIFACTS_DIR)
        print(f"Created model artifacts directory: {MLC_MODEL_ARTIFACTS_DIR}")

    app.run(debug=True, host='0.0.0.0', port=5000)
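For reference, the L2 normalization that `/embed_image` applies to the CLIP features can be reproduced in NumPy (already a backend dependency). A small sketch with toy random features standing in for real CLIP output:

```python
import numpy as np

# Toy stand-in for CLIP image features (batch of 1, 512-dim).
features = np.random.default_rng(0).normal(size=(1, 512))

# L2-normalize along the feature dimension, mirroring
# image_features / image_features.norm(p=2, dim=-1, keepdim=True) in app.py.
embeddings = features / np.linalg.norm(features, ord=2, axis=-1, keepdims=True)

print(float(np.linalg.norm(embeddings[0])))  # unit length: 1.0 (up to float error)
```

Normalizing before serialization means cosine similarity between two frames reduces to a plain dot product on the client side.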
architecture-diagram.js
ADDED
@@ -0,0 +1,96 @@

// Architecture visualization

class ArchitectureDiagram {
    constructor() {
        this.canvas = document.getElementById('architecture-canvas');
        if (!this.canvas) return;

        this.ctx = this.canvas.getContext('2d');
        this.resize();
        window.addEventListener('resize', () => this.resize());

        this.layers = [
            { name: "CLIENT", y: 0.15, color: "#00f0ff" },
            { name: "BACKEND (NODE)", y: 0.4, color: "#ffffff" },
            { name: "QUANTUM (GPU)", y: 0.65, color: "#7000ff" },
            { name: "STORAGE", y: 0.9, color: "#444444" }
        ];

        this.packets = [];
        this.animate();
    }

    resize() {
        const rect = this.canvas.getBoundingClientRect();
        this.canvas.width = rect.width;
        this.canvas.height = rect.height;
    }

    draw() {
        const w = this.canvas.width;
        const h = this.canvas.height;
        this.ctx.clearRect(0, 0, w, h);

        // Draw layers
        this.layers.forEach(layer => {
            const y = h * layer.y;

            // Line
            this.ctx.beginPath();
            this.ctx.strokeStyle = layer.color;
            this.ctx.globalAlpha = 0.3;
            this.ctx.lineWidth = 2;
            this.ctx.moveTo(50, y);
            this.ctx.lineTo(w - 50, y);
            this.ctx.stroke();

            // Label
            this.ctx.globalAlpha = 1;
            this.ctx.fillStyle = layer.color;
            this.ctx.font = "12px Space Mono";
            this.ctx.fillText(layer.name, 50, y - 10);

            // Nodes
            for (let i = 1; i <= 3; i++) {
                const nx = 50 + (w - 100) * (i / 4);
                this.ctx.beginPath();
                this.ctx.fillStyle = '#0f111a';
                this.ctx.strokeStyle = layer.color;
                this.ctx.lineWidth = 2;
                this.ctx.rect(nx - 15, y - 15, 30, 30);
                this.ctx.fill();
                this.ctx.stroke();
            }
        });

        // Spawn packets
        if (Math.random() > 0.95) {
            this.packets.push({
                x: 50 + (w - 100) * (Math.random() * 0.5 + 0.25),
                y: h * this.layers[0].y,
                targetY: h * this.layers[3].y,
                speed: 2 + Math.random() * 2
            });
        }

        // Draw packets
        this.ctx.fillStyle = '#fff';
        for (let i = this.packets.length - 1; i >= 0; i--) {
            let p = this.packets[i];
            p.y += p.speed;

            this.ctx.beginPath();
            this.ctx.arc(p.x, p.y, 3, 0, Math.PI * 2);
            this.ctx.fill();

            if (p.y > p.targetY) this.packets.splice(i, 1);
        }
    }

    animate() {
        this.draw();
        requestAnimationFrame(() => this.animate());
    }
}

new ArchitectureDiagram();
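The packet animation reduces to simple per-frame bookkeeping: occasionally spawn a packet at the CLIENT layer, advance every packet, and drop each one once it passes the STORAGE layer. A minimal Python sketch of the same logic (parameter names are illustrative; the real version iterates in reverse and splices in place):

```python
import random

def step(packets, spawn_y=0.15, target_y=0.9, spawn_chance=0.05):
    """Advance all packets one frame, occasionally spawn one, drop finished ones."""
    if random.random() < spawn_chance:
        packets.append({"y": spawn_y, "speed": 0.01})
    for p in packets:
        p["y"] += p["speed"]
    # Rebuilding the list is the safe-removal equivalent of the JS reverse loop.
    return [p for p in packets if p["y"] <= target_y]

packets = [{"y": 0.85, "speed": 0.1}]
packets = step(packets, spawn_chance=0.0)
print(len(packets))  # the packet passed 0.9 and was dropped -> 0
```

Running `step` once per animation frame keeps the packet list bounded without any explicit lifetime counters.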
index.html
CHANGED
|
@@ -1,19 +1,355 @@
|
|
| 1 |
-
<!
|
| 2 |
-
<html>
|
| 3 |
-
|
| 4 |
-
|
| 5 |
-
|
| 6 |
-
|
| 7 |
-
|
| 8 |
-
|
| 9 |
-
|
| 10 |
-
|
| 11 |
-
|
| 12 |
-
|
| 13 |
-
|
| 14 |
-
|
| 15 |
-
|
| 16 |
-
|
| 17 |
-
|
| 18 |
-
|
| 19 |
-
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
index.html
CHANGED

<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Quantum-Enhanced WAN 2.1 | System Simulation</title>
  <link rel="stylesheet" href="styles.css">
  <script src="https://cdnjs.cloudflare.com/ajax/libs/Chart.js/3.9.1/chart.min.js"></script>
</head>
<body>
  <div class="app-container">
    <nav class="side-nav">
      <div class="nav-brand">
        <div class="brand-icon">Q</div>
        <span>WAN 2.1</span>
      </div>
      <div class="nav-links">
        <a href="#simulator" class="nav-link active" data-section="simulator">
          <span class="icon">⚡</span> Simulation
        </a>
        <a href="#overview" class="nav-link" data-section="overview">
          <span class="icon">⦿</span> Executive Summary
        </a>
        <a href="#architecture" class="nav-link" data-section="architecture">
          <span class="icon">☊</span> Architecture
        </a>
        <a href="#backend" class="nav-link" data-section="backend">
          <span class="icon">◈</span> Quantum Backend
        </a>
        <a href="#specs" class="nav-link" data-section="specs">
          <span class="icon">⚙</span> Specifications
        </a>
      </div>
      <div class="nav-status">
        <div class="status-row">
          <span>WebGPU</span>
          <span class="status-dot active"></span>
        </div>
        <div class="status-row">
          <span>Qiskit.js</span>
          <span class="status-dot active"></span>
        </div>
        <div class="status-row">
          <span>Qubits</span>
          <span class="mono">512</span>
        </div>
      </div>
    </nav>

    <main class="content-area">
      <!-- Simulation Section (Default) -->
      <section id="simulator" class="section active">
        <header class="section-header">
          <h1>Live System Simulation</h1>
          <p class="subtitle">Hybrid Quantum-Classical Video Generation Interface</p>
        </header>

        <div class="simulation-grid">
          <div class="control-panel glass-panel">
            <h3>I2V Input Parameters</h3>

            <div class="input-group file-drop-zone" id="drop-zone">
              <input type="file" id="image-input" accept="image/*" hidden>
              <div class="drop-content">
                <span class="icon">📁</span>
                <span class="drop-text">UPLOAD SOURCE IMAGE</span>
                <span class="drop-sub">Drag &amp; Drop or Click to Browse</span>
              </div>
              <img id="preview-img" class="hidden" alt="Preview">
            </div>

            <div class="input-group">
              <label>Prompt Context (CLIP Guidance)</label>
              <textarea id="prompt-input" placeholder="Describe the motion to generate... (e.g., 'Camera zoom with quantum distortion')"></textarea>
            </div>

            <div class="controls-row">
              <div class="input-group">
                <label>Quantum Influence <span class="value-badge" id="influence-val">5%</span></label>
                <input type="range" id="quantum-influence" min="0" max="100" value="5">
                <div class="slider-meta">Low (Deterministic) — High (Chaotic)</div>
              </div>

              <div class="input-group">
                <label>Entanglement Depth <span class="value-badge" id="depth-val">16</span></label>
                <input type="range" id="entanglement-depth" min="1" max="16" value="16">
                <div class="slider-meta">Circuit Layers (1-16)</div>
              </div>
            </div>

            <div class="controls-row">
              <div class="input-group">
                <label>Sampling Method</label>
                <select id="sampling-method">
                  <option value="adaptive">Adaptive Quantum Injection</option>
                  <option value="direct">Direct Latent Modulation</option>
                  <option value="feedback">Continuous Feedback Loop</option>
                </select>
              </div>
            </div>

            <div class="director-controls glass-panel-inner">
              <div class="director-header">
                <label class="switch-container">
                  <span class="label-text">DIRECTOR MODE</span>
                  <input type="checkbox" id="director-mode-toggle" checked>
                  <span class="toggle-slider"></span>
                </label>
                <span class="frame-count" id="total-frames">0 FRAMES</span>
              </div>
              <div class="director-actions">
                <button id="download-btn" class="btn-secondary" disabled>
                  <span class="icon">💾</span> SAVE MOVIE
                </button>
                <button id="reset-movie-btn" class="btn-danger" disabled>
                  <span class="icon">✖</span> CLEAR
                </button>
              </div>
            </div>

            <div class="controls-row">
              <button id="start-btn" class="btn-primary">
                <span class="btn-text">INITIALIZE GENERATION</span>
                <span class="btn-glitch"></span>
              </button>
            </div>
          </div>

          <div class="visualization-panel glass-panel">
            <div class="viz-tabs">
              <button class="viz-tab active" data-view="output">Video Output</button>
              <button class="viz-tab" data-view="circuit">Quantum Circuit</button>
              <button class="viz-tab" data-view="state">State Vector</button>
            </div>

            <div class="viz-content">
              <div id="view-output" class="viz-view active">
                <canvas id="output-canvas"></canvas>
                <div class="overlay-stats" id="generation-stats">WAITING FOR INPUT...</div>
                <div class="scanline"></div>
              </div>
              <div id="view-circuit" class="viz-view">
                <canvas id="quantum-circuit-canvas"></canvas>
              </div>
              <div id="view-state" class="viz-view">
                <canvas id="state-vector-canvas"></canvas>
                <div class="entropy-readout">Entanglement Entropy: <span id="entropy-value">0.00</span></div>
              </div>
            </div>
          </div>

          <div class="terminal-panel glass-panel">
            <div class="terminal-header">
              <span>SYSTEM_LOGS</span>
              <span class="terminal-status">CONNECTED</span>
            </div>
            <div class="terminal-body" id="system-logs">
              <div class="log-line"><span class="ts">[00:00:00]</span> System ready. Waiting for user input...</div>
            </div>
          </div>
        </div>
      </section>
      <!-- Executive Summary -->
      <section id="overview" class="section">
        <div class="document-wrapper glass-panel">
          <h1>Executive Summary</h1>
          <p class="lead">This document defines a complete rebuild of Alibaba's WAN 2.1 video generation system on a web-based quantum compute backend.</p>

          <p>The system replaces traditional GPU inference with a <strong>hybrid quantum-classical architecture</strong> running entirely in the browser using WebGPU and Qiskit-powered WebWorkers. The core innovation is an interface in which quantum circuit evaluations feed directly into the diffusion model's latent space, so that quantum superposition can shape the generative output.</p>

          <div class="metric-cards">
            <div class="card">
              <div class="metric-val">512</div>
              <div class="metric-label">Qubits (State Matrix)</div>
            </div>
            <div class="card">
              <div class="metric-val">WebGPU</div>
              <div class="metric-label">Compute Engine</div>
            </div>
            <div class="card">
              <div class="metric-val">Local</div>
              <div class="metric-label">Privacy-First Inference</div>
            </div>
          </div>

          <h3>Core Objectives</h3>
          <ul class="feature-list">
            <li>Reproduce WAN 2.1's video generation capabilities in a web browser.</li>
            <li>Leverage quantum computing for unique generative variability.</li>
            <li>Implement a privacy-first, high-performance video generator on consumer hardware.</li>
          </ul>
        </div>
      </section>

      <!-- Architecture -->
      <section id="architecture" class="section">
        <div class="document-wrapper glass-panel">
          <h1>System Architecture Overview</h1>
          <p>The quantum-enhanced WAN 2.1 system is composed of four main layers orchestrated to deliver the final video output.</p>

          <div class="diagram-container">
            <canvas id="architecture-canvas"></canvas>
          </div>

          <div class="architecture-grid">
            <div class="arch-card">
              <h3>1. Browser Client Layer</h3>
              <p>User-facing front end running in the browser.</p>
              <ul>
                <li><strong>UI Canvas:</strong> Control center for prompt input and parameter adjustment.</li>
                <li><strong>Quantum Visualizer:</strong> Real-time display of quantum circuits and state vectors.</li>
                <li><strong>Video Player:</strong> Embedded player for generated results.</li>
                <li><strong>Main Thread Orchestrator:</strong> Manages state and coordinates visualization.</li>
              </ul>
            </div>
            <div class="arch-card">
              <h3>2. Quantum Compute Backend</h3>
              <p>Core system leveraging quantum computing for generative influence.</p>
              <ul>
                <li><strong>WebWorker Pool:</strong> 4-8 parallel workers for circuit simulation.</li>
                <li><strong>Qiskit.js:</strong> Circuit builder for custom gate sequences.</li>
                <li><strong>WebGPU Engine:</strong> Accelerates 512-qubit state-vector evolution using WGSL shaders.</li>
                <li><strong>State Analyzer:</strong> Computes entanglement entropy and fidelity.</li>
              </ul>
            </div>
            <div class="arch-card">
              <h3>3. Web Backend Server</h3>
              <p>Orchestration and classical deep learning inference (Node.js/Python).</p>
              <ul>
                <li><strong>REST API Gateway:</strong> Handles requests and authentication.</li>
                <li><strong>Quantum-Classical Bridge:</strong> Translates quantum features into diffusion parameters.</li>
                <li><strong>WAN 2.1 Engine:</strong> Distributed inference layer for the T5 encoder and VAE/Diffusion models.</li>
              </ul>
            </div>
            <div class="arch-card">
              <h3>4. Storage &amp; Cache</h3>
              <p>Persistence layer for models and results.</p>
              <ul>
                <li><strong>Redis Cache:</strong> In-memory storage for model weights.</li>
                <li><strong>S3/MinIO:</strong> Durable storage for generated videos.</li>
                <li><strong>Circuit Library:</strong> Repository of pre-defined quantum circuits.</li>
              </ul>
            </div>
          </div>
        </div>
      </section>
      <!-- Backend Specs -->
      <section id="backend" class="section">
        <div class="document-wrapper glass-panel">
          <h1>Quantum Compute Backend Specification</h1>

          <h3>512-Qubit Quantum State Architecture</h3>
          <p>Operating on a 512-qubit system requires sophisticated memory management. The system uses a layered sparse-representation strategy:</p>

          <div class="specs-table">
            <div class="spec-row header">
              <span>Layer</span>
              <span>Qubits</span>
              <span>Storage Strategy</span>
              <span>Approx. Size</span>
            </div>
            <div class="spec-row">
              <span>Layer 1</span>
              <span>0-12</span>
              <span>Dense Vector</span>
              <span>~16MB</span>
            </div>
            <div class="spec-row">
              <span>Layer 2</span>
              <span>13-24</span>
              <span>Compressed Tensor</span>
              <span>~256MB</span>
            </div>
            <div class="spec-row">
              <span>Layer 3</span>
              <span>25-512</span>
              <span>Sparse + MPS (Adaptive)</span>
              <span>Variable</span>
            </div>
          </div>

          <h3>Qiskit.js Integration</h3>
          <p>The <strong>QuantumComputeEngine</strong> class orchestrates the backend:</p>
          <ul class="process-list">
            <li><strong>Initialization:</strong> Sets up the WebGPU device and 16-bit floating-point support.</li>
            <li><strong>Circuit Building:</strong> Constructs circuits via the Qiskit.js API (gates: H, CNOT, RY, etc.).</li>
            <li><strong>Compilation:</strong> Generates optimized WGSL shaders for the specific gate sequence.</li>
            <li><strong>Execution:</strong> Dispatches compute shaders to the GPU for parallel state evolution.</li>
            <li><strong>Analysis:</strong> Extracts metrics (entropy, fidelity) for the Bridge.</li>
          </ul>
        </div>
      </section>
      <!-- Detailed Specs / Development -->
      <section id="specs" class="section">
        <div class="document-wrapper glass-panel">
          <h1>Development &amp; Optimization</h1>

          <div class="two-col">
            <div>
              <h3>Quantum-Classical Bridge</h3>
              <p>The critical link between quantum randomness and creative output.</p>
              <ul>
                <li><strong>Feature Extraction:</strong> Pulls entropy and phase data from the quantum state.</li>
                <li><strong>Adaptive Sampling:</strong> Dynamically adjusts quantum influence based on generation complexity.</li>
                <li><strong>Injection Pipeline:</strong> Modifies text-encoder latents, modulates diffusion noise, or alters VAE decoding.</li>
              </ul>
            </div>
            <div>
              <h3>Optimization Strategies</h3>
              <ul>
                <li><strong>WebGPU Acceleration:</strong> Parallel matrix operations on consumer GPUs.</li>
                <li><strong>Circuit Decomposition:</strong> Simplifies gates before compilation.</li>
                <li><strong>Hybrid Inference:</strong> Interleaves classical CPU tasks with quantum GPU tasks.</li>
                <li><strong>ONNX Runtime:</strong> Optimized execution for the classical diffusion model.</li>
              </ul>
            </div>
          </div>

          <h3>Phases of Development</h3>
          <div class="timeline">
            <div class="timeline-item">
              <div class="phase">Phase 1</div>
              <div class="desc"><strong>Environment &amp; PoC:</strong> Set up Qiskit.js, WebGPU, and the basic pipeline.</div>
            </div>
            <div class="timeline-item">
              <div class="phase">Phase 2</div>
              <div class="desc"><strong>Circuit Integration:</strong> Implement circuit templates and WGSL shader generation.</div>
            </div>
            <div class="timeline-item">
              <div class="phase">Phase 3</div>
              <div class="desc"><strong>Classical Backend:</strong> Integrate the WAN 2.1 model via ONNX Runtime/Node.js.</div>
            </div>
            <div class="timeline-item">
              <div class="phase">Phase 4</div>
              <div class="desc"><strong>The Bridge:</strong> Implement parameter-injection logic and adaptive sampling.</div>
            </div>
            <div class="timeline-item">
              <div class="phase">Phase 5</div>
              <div class="desc"><strong>UI/UX:</strong> Build the visualization dashboard and interactivity.</div>
            </div>
          </div>
        </div>
      </section>
    </main>
  </div>

  <script type="module" src="app.js"></script>
  <script type="module" src="architecture-diagram.js"></script>
  <script type="module" src="quantum-viz.js"></script>
  <script type="module" src="simulator.js"></script>
</body>
</html>
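The backend sections above describe a State Analyzer that reduces the evolved state to the entanglement-entropy scalar surfaced in the `#entropy-value` readout. A minimal sketch of that reduction, assuming the analyzer receives amplitude magnitudes (the function name and input shape are illustrative, not code from this repo):

```javascript
// Illustrative sketch: reduce amplitude magnitudes to the Shannon entropy
// shown in the UI. The name and input format are assumptions.
function shannonEntropy(magnitudes) {
  // |amplitude|^2 gives measurement probabilities; normalize defensively.
  const probs = magnitudes.map(m => m * m);
  const total = probs.reduce((a, b) => a + b, 0) || 1;
  return probs.reduce((h, p) => {
    const q = p / total;
    return q > 0 ? h - q * Math.log2(q) : h;
  }, 0);
}
```

A uniform superposition over four basis states yields 2 bits of entropy and a single basis state yields 0; the Quantum-Classical Bridge would then map such scalars onto diffusion parameters.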
quantum-viz.js
ADDED
@@ -0,0 +1,327 @@
// Enhanced Quantum Visualizations with Vivid Effects

class QuantumCircuitViz {
    constructor() {
        this.container = document.getElementById('view-circuit');
        this.canvas = document.getElementById('quantum-circuit-canvas');
        this.ctx = this.canvas?.getContext('2d');

        this.qubits = 8; // Fixed for visualization
        this.gates = [];
        this.photons = [];
        this.currentInfluence = 0;
        this.currentDepth = 0;

        this.generateRandomGates(0); // Initial generation

        // Observe visibility changes for proper initialization
        if (this.container) {
            const observer = new ResizeObserver(() => this.resize());
            observer.observe(this.container);
        }

        this.resize();
        this.animate(0);
    }

    resize() {
        if (!this.container || !this.canvas) return;
        const rect = this.container.getBoundingClientRect();
        if (rect.width > 0 && rect.height > 0) {
            this.canvas.width = rect.width;
            this.canvas.height = rect.height;
        }
    }

    generateRandomGates(depthModifier = 0) {
        this.gates = [];
        const types = ['H', 'X', 'Y', 'Z', 'CNOT', 'P', 'T', 'S'];
        const baseSteps = 10;
        // Increase complexity based on depth
        const steps = Math.min(20, baseSteps + Math.floor(depthModifier / 2));

        for (let i = 0; i < steps; i++) {
            const col = [];
            const usedQubits = new Set();

            for (let q = 0; q < this.qubits; q++) {
                if (usedQubits.has(q)) continue;

                // Higher influence raises the chance of a gate appearing
                if (Math.random() > (0.6 - (this.currentInfluence / 200))) {
                    const type = types[Math.floor(Math.random() * types.length)];

                    if (type === 'CNOT' && q < this.qubits - 1) {
                        // Control bit
                        col.push({ type: '•', qubit: q, target: q + 1 });
                        // Target bit (virtual gate for viz)
                        col.push({ type: '⊕', qubit: q + 1 });
                        usedQubits.add(q);
                        usedQubits.add(q + 1);
                    } else {
                        // A CNOT drawn on the last wire has no target below it,
                        // so fall back to rendering a single-qubit X gate.
                        col.push({ type: type === 'CNOT' ? 'X' : type, qubit: q });
                        usedQubits.add(q);
                    }
                }
            }
            this.gates.push(col);
        }
    }

    // Update influence and depth from the control panel
    updateVizParameters(influence, depth) {
        this.currentInfluence = influence;
        this.currentDepth = depth;
        // Regenerate gates with the new depth
        this.generateRandomGates(depth);
        // Trigger a burst of activity proportional to influence
        const photonBurst = Math.floor(influence / 10);
        for (let i = 0; i < photonBurst; i++) {
            this.spawnPhoton();
        }
    }

    randomize() { // Kept for backward compatibility; prefer updateVizParameters
        this.generateRandomGates(this.currentDepth);
        for (let i = 0; i < 8; i++) {
            this.spawnPhoton();
        }
    }

    spawnPhoton() {
        this.photons.push({
            x: 0,
            y: Math.floor(Math.random() * this.qubits),
            speed: (0.02 + Math.random() * 0.04) * (1 + this.currentInfluence / 100), // Faster with more influence
            color: Math.random() > 0.5 ? '#00f0ff' : '#7000ff',
            size: 2 + Math.random() * 3 * (1 + this.currentInfluence / 200) // Larger with more influence
        });
    }

    draw(time) {
        if (!this.ctx) return;
        const w = this.canvas.width;
        const h = this.canvas.height;

        this.ctx.fillStyle = '#050510';
        this.ctx.fillRect(0, 0, w, h);

        const rowH = h / (this.qubits + 1);
        const colW = w / (this.gates.length + 1);

        // Draw qubit wires
        this.ctx.shadowBlur = 0;
        for (let i = 0; i < this.qubits; i++) {
            const y = rowH * (i + 1);

            this.ctx.beginPath();
            this.ctx.strokeStyle = 'rgba(255, 255, 255, 0.1)';
            this.ctx.lineWidth = 1;
            this.ctx.moveTo(40, y);
            this.ctx.lineTo(w - 20, y);
            this.ctx.stroke();

            // Label
            this.ctx.fillStyle = 'rgba(0, 240, 255, 0.8)';
            this.ctx.font = '11px "Space Mono"';
            this.ctx.fillText(`|0⟩`, 10, y + 4);
        }

        // Update and draw photons (data flow); spawn rate scales with influence
        if (Math.random() > (0.92 - this.currentInfluence / 500)) this.spawnPhoton();

        this.photons.forEach(p => {
            p.x += p.speed;
            const px = 40 + p.x * (w - 60);
            const py = rowH * (p.y + 1);

            this.ctx.shadowBlur = 10 * (1 + this.currentInfluence / 100); // Larger glow with more influence
            this.ctx.shadowColor = p.color;
            this.ctx.fillStyle = p.color;

            this.ctx.beginPath();
            this.ctx.arc(px, py, p.size, 0, Math.PI * 2);
            this.ctx.fill();
        });
        // Remove finished photons after the draw pass; splicing inside the
        // forEach above would skip the element following each removal.
        this.photons = this.photons.filter(p => p.x <= 1);

        // Draw gates
        this.gates.forEach((col, xIndex) => {
            const x = 60 + xIndex * colW;

            col.forEach(gate => {
                const y = rowH * (gate.qubit + 1);

                this.ctx.shadowBlur = 12 * (1 + this.currentInfluence / 100); // Larger glow with more influence
                this.ctx.shadowColor = 'rgba(112, 0, 255, 0.4)';

                if (gate.type === '•') {
                    // Control dot
                    this.ctx.fillStyle = '#fff';
                    this.ctx.beginPath();
                    this.ctx.arc(x, y, 4, 0, Math.PI * 2);
                    this.ctx.fill();

                    // Line to target
                    this.ctx.strokeStyle = '#fff';
                    this.ctx.lineWidth = 2;
                    this.ctx.beginPath();
                    this.ctx.moveTo(x, y);
                    this.ctx.lineTo(x, rowH * (gate.target + 1));
                    this.ctx.stroke();
                }
                else if (gate.type === '⊕') {
                    // CNOT target
                    this.ctx.fillStyle = '#000';
                    this.ctx.strokeStyle = '#fff';
                    this.ctx.lineWidth = 2;
                    this.ctx.beginPath();
                    this.ctx.arc(x, y, 8, 0, Math.PI * 2);
                    this.ctx.fill();
                    this.ctx.stroke();
                    // Plus sign
                    this.ctx.beginPath();
                    this.ctx.moveTo(x - 5, y);
                    this.ctx.lineTo(x + 5, y);
                    this.ctx.moveTo(x, y - 5);
                    this.ctx.lineTo(x, y + 5);
                    this.ctx.stroke();
                }
                else {
                    // Standard gate
                    const isH = gate.type === 'H';
                    this.ctx.fillStyle = isH ? '#7000ff' : '#050510';
                    this.ctx.strokeStyle = '#00f0ff';
                    this.ctx.shadowColor = isH ? '#7000ff' : '#00f0ff';

                    this.ctx.fillRect(x - 12, y - 12, 24, 24);
                    this.ctx.strokeRect(x - 12, y - 12, 24, 24);

                    this.ctx.fillStyle = '#fff';
                    this.ctx.textAlign = 'center';
                    this.ctx.font = 'bold 12px "Space Mono"';
                    this.ctx.shadowBlur = 0;
                    this.ctx.fillText(gate.type, x, y + 4);
                }
            });
        });
    }

    animate(time) {
        this.draw(time);
        // Auto-correct size if the container changed while hidden
        if (this.container && this.canvas && this.container.clientWidth !== this.canvas.width && this.container.clientWidth > 0) {
            this.resize();
        }
        requestAnimationFrame((t) => this.animate(t));
    }
}

class StateVectorViz {
    constructor() {
        this.container = document.getElementById('view-state');
        this.canvas = document.getElementById('state-vector-canvas');
        this.ctx = this.canvas?.getContext('2d');

        // Complex amplitudes simulation (magnitude + phase)
        this.bins = 64;
        this.magnitudes = new Array(this.bins).fill(0).map(() => Math.random() * 0.4);
        this.phases = new Array(this.bins).fill(0).map(() => Math.random() * Math.PI * 2);
        this.currentInfluence = 0;
        this.currentDepth = 0;

        if (this.container) {
            new ResizeObserver(() => this.resize()).observe(this.container);
        }

        this.resize();
        this.animate(0);
    }

    resize() {
        if (!this.container || !this.canvas) return;
        const rect = this.container.getBoundingClientRect();
        if (rect.width > 0 && rect.height > 0) {
            this.canvas.width = rect.width;
            this.canvas.height = rect.height;
        }
    }

    updateVizParameters(influence, depth) {
        this.currentInfluence = influence;
        this.currentDepth = depth;
        // Trigger a burst of energy based on influence
        const spikeBurst = Math.floor(influence / 20);
        for (let i = 0; i < spikeBurst; i++) {
            this.spike();
        }
    }

    spike() {
        // Add energy to the system, more intense with higher influence
        const spikeAmount = 0.5 * (1 + this.currentInfluence / 50);
        for (let i = 0; i < 10; i++) {
            const idx = Math.floor(Math.random() * this.bins);
            this.magnitudes[idx] = Math.min(1.0, this.magnitudes[idx] + spikeAmount);
        }
    }

    draw(time) {
        if (!this.ctx) return;
        const w = this.canvas.width;
        const h = this.canvas.height;

        this.ctx.clearRect(0, 0, w, h);

        // Grid
        this.ctx.strokeStyle = 'rgba(255,255,255,0.05)';
        this.ctx.lineWidth = 1;
        this.ctx.beginPath();
        for (let x = 0; x < w; x += 40) { this.ctx.moveTo(x, 0); this.ctx.lineTo(x, h); }
        this.ctx.stroke();

        const barW = w / this.bins;

        for (let i = 0; i < this.bins; i++) {
            // Physics update: decay faster with higher influence
            const decayRate = 0.98 - (this.currentInfluence / 2000);
            this.magnitudes[i] = Math.max(0.05, this.magnitudes[i] * decayRate);
            // Random fluctuation ("quantum foam"), more pronounced with higher influence
            if (Math.random() > (0.9 - this.currentInfluence / 1000)) this.magnitudes[i] += 0.05 * (1 + this.currentInfluence / 100);

            // Phase fluctuation, more chaotic with higher depth and influence
            this.phases[i] += (Math.random() - 0.5) * (0.2 + this.currentInfluence / 500 + this.currentDepth / 100);

            const mag = this.magnitudes[i];
            const barH = mag * h * 0.9;
            const x = i * barW;
            const y = h - barH;

            // Color based on index (frequency) + magnitude intensity.
            // Map index to hue: 200 (blue) -> 300 (purple).
            const hue = 200 + (i / this.bins) * 100;
            const color = `hsl(${hue}, 100%, 50%)`;

            this.ctx.fillStyle = color;
            this.ctx.fillRect(x, y, barW - 1, barH);
        }

        // Occasional random update, more frequent with influence/depth
        if (Math.random() > (0.95 - this.currentInfluence / 1000 - this.currentDepth / 500)) {
            const idx = Math.floor(Math.random() * this.bins);
            this.magnitudes[idx] = Math.random();
        }
    }

    animate(time) {
        this.draw(time);
        requestAnimationFrame((t) => this.animate(t));
    }
}

// Expose globally for the simulator to trigger
window.circuitViz = new QuantumCircuitViz();
window.stateViz = new StateVectorViz();
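quantum-viz.js exposes both visualizers on `window`, and each implements `updateVizParameters(influence, depth)`. Since simulator.js itself is not readable in this diff, here is a hedged sketch of how the control-panel sliders from index.html might be forwarded to them (the helper names are assumptions, not code from the repo):

```javascript
// Sketch only: simulator.js is not shown in this diff, so this wiring is
// an assumption built on the APIs that quantum-viz.js actually exposes.
function readVizParams(doc) {
  // Slider ids match index.html's control panel.
  return {
    influence: Number(doc.getElementById('quantum-influence').value),
    depth: Number(doc.getElementById('entanglement-depth').value),
  };
}

function pushVizParams(win, params) {
  // Both visualizers implement updateVizParameters(influence, depth).
  win.circuitViz?.updateVizParameters(params.influence, params.depth);
  win.stateViz?.updateVizParameters(params.influence, params.depth);
}

// In the browser this would run on slider input, e.g.:
// document.getElementById('quantum-influence')
//   .addEventListener('input', () => pushVizParams(window, readVizParams(document)));
```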
requirements.txt
ADDED
@@ -0,0 +1,8 @@
transformers
torch
Pillow
Flask
flask-cors
mlc-llm-nightly
sentencepiece
outlines
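The dependencies above (Flask, flask-cors, mlc-llm-nightly) back the Python service in app.py that the README describes as the AI Director. A hedged sketch of a frontend request builder for that service; the endpoint path and payload fields are hypothetical placeholders, not taken from app.py:

```javascript
// Hypothetical sketch: the route name '/api/director' and the payload
// fields below are placeholders, not the actual app.py contract.
function buildDirectorRequest(prompt, frameIndex, influence) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      prompt,                        // user's motion description
      frame: frameIndex,             // index of the frame being generated
      quantum_influence: influence,  // 0-100 slider value
    }),
  };
}

// Usage (browser):
// fetch('/api/director', buildDirectorRequest(promptText, 0, 5)).then(r => r.json());
```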
simulator.js
ADDED
@@ -0,0 +1,644 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
// Advanced Simulation Controller
|
| 2 |
+
|
| 3 |
+
const BACKEND_URL = window.location.origin; // Dynamically set backend URL for HuggingFace Spaces compatibility
|
| 4 |
+
|
| 5 |
+
class SystemSimulator {
|
| 6 |
+
constructor() {
|
| 7 |
+
this.logs = document.getElementById('system-logs');
|
| 8 |
+
this.outputCanvas = document.getElementById('output-canvas');
|
| 9 |
+
this.outputCtx = this.outputCanvas?.getContext('2d');
|
| 10 |
+
|
| 11 |
+
this.isGenerating = false;
|
| 12 |
+
this.sourceImage = null;
|
| 13 |
+
this.config = {
|
| 14 |
+
prompt: '',
|
| 15 |
+
influence: 5, // Default 5%
|
| 16 |
+
depth: 16, // Default 16 layers
|
| 17 |
+
method: 'adaptive'
|
| 18 |
+
};
|
| 19 |
+
|
| 20 |
+
// Director Mode State
|
| 21 |
+
this.directorMode = true;
|
| 22 |
+
this.movieFrames = []; // Stores ImageBitmaps or DataURLs
|
| 23 |
+
this.accumulatedFrames = 0;
|
| 24 |
+
|
| 25 |
+
this.init();
|
| 26 |
+
}
|
| 27 |
+
|
| 28 |
+
async callBackendApi(endpoint, data) {
|
| 29 |
+
try {
|
| 30 |
+
const response = await fetch(`${BACKEND_URL}${endpoint}`, {
|
| 31 |
+
method: 'POST',
|
| 32 |
+
headers: {
|
| 33 |
+
'Content-Type': 'application/json',
|
| 34 |
+
},
|
| 35 |
+
body: JSON.stringify(data),
|
| 36 |
+
});
|
| 37 |
+
const jsonResponse = await response.json();
|
| 38 |
+
if (!response.ok) {
|
| 39 |
+
throw new Error(jsonResponse.error || `Backend error: ${response.statusText}`);
|
| 40 |
+
}
|
| 41 |
+
return jsonResponse;
|
| 42 |
+
} catch (error) {
|
| 43 |
+
this.log(`Backend API Error (${endpoint}): ${error.message}`, 'error');
|
| 44 |
+
console.error(`Backend API Error (${endpoint}):`, error);
|
| 45 |
+
throw error; // Re-throw to be caught by the calling function
|
| 46 |
+
}
|
| 47 |
+
}
|
| 48 |
+
|
| 49 |
+
init() {
|
| 50 |
+
this.setupListeners();
|
| 51 |
+
this.resizeCanvas();
|
| 52 |
+
window.addEventListener('resize', () => this.resizeCanvas());
|
| 53 |
+
|
| 54 |
+
// Initial visual state
|
| 55 |
+
this.drawStaticNoise();
|
| 56 |
+
}
|
| 57 |
+
|
| 58 |
+
setupListeners() {
|
| 59 |
+
// Image Upload Handling
|
| 60 |
+
const dropZone = document.getElementById('drop-zone');
|
| 61 |
+
const fileInput = document.getElementById('image-input');
|
| 62 |
+
|
| 63 |
+
dropZone.addEventListener('click', () => fileInput.click());
|
| 64 |
+
|
| 65 |
+
dropZone.addEventListener('dragover', (e) => {
|
| 66 |
+
e.preventDefault();
|
| 67 |
+
dropZone.classList.add('drag-over');
|
| 68 |
+
});
|
| 69 |
+
|
| 70 |
+
dropZone.addEventListener('dragleave', () => {
|
| 71 |
+
dropZone.classList.remove('drag-over');
|
| 72 |
+
});
|
| 73 |
+
|
| 74 |
+
dropZone.addEventListener('drop', (e) => {
|
| 75 |
+
e.preventDefault();
|
| 76 |
+
dropZone.classList.remove('drag-over');
|
| 77 |
+
if(e.dataTransfer.files.length) {
|
| 78 |
+
this.handleImage(e.dataTransfer.files[0]);
|
| 79 |
+
}
|
| 80 |
+
});
|
| 81 |
+
|
| 82 |
+
fileInput.addEventListener('change', (e) => {
|
| 83 |
+
if(e.target.files.length) {
|
| 84 |
+
this.handleImage(e.target.files[0]);
|
| 85 |
+
}
|
| 86 |
+
});
|
| 87 |
+
|
| 88 |
+
// Director Mode Listeners
|
| 89 |
+
const directorToggle = document.getElementById('director-mode-toggle');
|
| 90 |
+
if (directorToggle) {
|
| 91 |
+
directorToggle.addEventListener('change', (e) => {
|
| 92 |
+
this.directorMode = e.target.checked;
|
| 93 |
+
this.log(`Director Mode: ${this.directorMode ? 'ENABLED' : 'DISABLED'}`, 'info');
|
| 94 |
+
});
|
| 95 |
+
}
|
| 96 |
+
|
| 97 |
+
document.getElementById('download-btn').addEventListener('click', () => this.downloadMovie());
|
| 98 |
+
document.getElementById('reset-movie-btn').addEventListener('click', () => this.resetMovie());
|
| 99 |
+
|
| 100 |
+
// Inputs
|
| 101 |
+
document.getElementById('quantum-influence').addEventListener('input', (e) => {
|
| 102 |
+
document.getElementById('influence-val').textContent = `${e.target.value}%`;
|
| 103 |
+
this.config.influence = parseInt(e.target.value);
|
| 104 |
+
});
|
| 105 |
+
|
| 106 |
+
document.getElementById('entanglement-depth').addEventListener('input', (e) => {
|
| 107 |
+
document.getElementById('depth-val').textContent = e.target.value;
|
| 108 |
+
this.config.depth = parseInt(e.target.value);
|
| 109 |
+
});
|
| 110 |
+
|
| 111 |
+
// Tabs
|
| 112 |
+
document.querySelectorAll('.viz-tab').forEach(tab => {
|
| 113 |
+
tab.addEventListener('click', () => {
|
| 114 |
+
document.querySelectorAll('.viz-tab').forEach(t => t.classList.remove('active'));
|
| 115 |
+
document.querySelectorAll('.viz-view').forEach(v => v.classList.remove('active'));
|
| 116 |
+
|
| 117 |
+
tab.classList.add('active');
|
| 118 |
+
document.getElementById(`view-${tab.dataset.view}`).classList.add('active');
|
| 119 |
+
});
|
| 120 |
+
});
|
| 121 |
+
|
| 122 |
+
// Start Button
|
| 123 |
+
document.getElementById('start-btn').addEventListener('click', () => this.startGeneration());
|
| 124 |
+
}
|
| 125 |
+
|
| 126 |
+
handleImage(file) {
|
| 127 |
+
const reader = new FileReader();
|
| 128 |
+
reader.onload = async (e) => { // Made onload async
|
| 129 |
+
this.sourceImage = new Image();
|
| 130 |
+
this.sourceImage.onload = async () => { // Made onload async
|
| 131 |
+
// Show preview
|
| 132 |
+
const preview = document.getElementById('preview-img');
|
| 133 |
+
preview.src = this.sourceImage.src;
|
| 134 |
+
preview.classList.remove('hidden');
|
| 135 |
+
document.querySelector('.drop-content').style.opacity = '0';
|
| 136 |
+
this.log(`Image loaded: ${file.name} (${this.sourceImage.width}x${this.sourceImage.height})`, 'success');
|
| 137 |
+
|
| 138 |
+
// Call backend for CLIP analysis
|
| 139 |
+
try {
|
| 140 |
+
await this.analyzeImageContext(e.target.result); // Pass dataURL directly
|
| 141 |
+
} catch (error) {
|
| 142 |
+
this.log(`Failed CLIP analysis for ${file.name}: ${error.message}`, 'error');
|
| 143 |
+
}
|
| 144 |
+
};
|
| 145 |
+
this.sourceImage.src = e.target.result;
|
| 146 |
+
};
|
| 147 |
+
reader.readAsDataURL(file);
|
| 148 |
+
}
|
| 149 |
+
|
| 150 |
+
async analyzeImageContext(imageDataURL) {
|
| 151 |
+
this.log('CLIP-Encoder: Sending image for feature extraction...', 'info');
|
| 152 |
+
try {
|
| 153 |
+
const response = await this.callBackendApi('/embed_image', { image: imageDataURL });
|
| 154 |
+
const embeddings = response.embeddings;
|
| 155 |
+
this.log(`CLIP-Encoder: Extracted feature vector [${embeddings[0].toFixed(4)}, ${embeddings[1].toFixed(4)}, ${embeddings[2].toFixed(4)}, ...]`, 'success');
|
| 156 |
+
} catch (error) {
|
| 157 |
+
this.log(`CLIP-Encoder: Failed to get embeddings. Is backend running? ${error.message}`, 'error');
|
| 158 |
+
throw error;
|
| 159 |
+
}
|
| 160 |
+
}
|
| 161 |
+
|
| 162 |
+
updateDirectorUI() {
|
| 163 |
+
document.getElementById('total-frames').textContent = `${this.movieFrames.length} FRAMES`;
|
| 164 |
+
document.getElementById('download-btn').disabled = this.movieFrames.length === 0;
|
| 165 |
+
document.getElementById('reset-movie-btn').disabled = this.movieFrames.length === 0;
|
| 166 |
+
}
|
| 167 |
+
|
| 168 |
+
resetMovie() {
|
| 169 |
+
this.movieFrames = [];
|
| 170 |
+
this.updateDirectorUI();
|
| 171 |
+
this.log('Director Mode: Timeline cleared.', 'warn');
|
| 172 |
+
}
|
| 173 |
+
|
| 174 |
+
resizeCanvas() {
|
| 175 |
+
if (!this.outputCanvas) return;
|
| 176 |
+
const rect = this.outputCanvas.parentElement.getBoundingClientRect();
|
| 177 |
+
this.outputCanvas.width = rect.width;
|
| 178 |
+
this.outputCanvas.height = rect.height;
|
| 179 |
+
if (!this.isGenerating) this.drawStaticNoise();
|
| 180 |
+
}
|
| 181 |
+
|
| 182 |
+
log(message, type = 'info') {
|
| 183 |
+
const div = document.createElement('div');
|
| 184 |
+
div.className = `log-line ${type}`;
|
| 185 |
+
const time = new Date().toLocaleTimeString('en-US', { hour12: false });
|
| 186 |
+
div.innerHTML = `<span class="ts">[${time}]</span> ${message}`;
|
| 187 |
+
this.logs.appendChild(div);
|
| 188 |
+
this.logs.scrollTop = this.logs.scrollHeight;
|
| 189 |
+
}
|
| 190 |
+
|
| 191 |
+
async startGeneration() {
|
| 192 |
+
if (this.isGenerating) return;
|
| 193 |
+
|
| 194 |
+
// Disable UI elements during generation
|
| 195 |
+
this.isGenerating = true;
|
| 196 |
+
document.getElementById('start-btn').disabled = true;
|
| 197 |
+
document.getElementById('prompt-input').disabled = true;
|
| 198 |
+
document.getElementById('image-input').disabled = true;
|
| 199 |
+
document.getElementById('quantum-influence').disabled = true;
|
| 200 |
+
document.getElementById('entanglement-depth').disabled = true;
|
| 201 |
+
document.getElementById('sampling-method').disabled = true;
|
| 202 |
+
|
| 203 |
+
document.getElementById('generation-stats').style.display = 'block';
|
| 204 |
+
|
| 205 |
+
try {
|
| 206 |
+
if (!this.sourceImage) {
|
| 207 |
+
this.log('Error: Source image required for I2V generation.', 'error');
|
| 208 |
+
alert("Please upload a source image first.");
|
| 209 |
+
return; // Exit if no source image
|
| 210 |
+
}
|
| 211 |
+
|
| 212 |
+
const prompt = document.getElementById('prompt-input').value.trim() || "Quantum interpolation";
|
| 213 |
+
|
| 214 |
+
// --- Backend Health Check ---
|
| 215 |
+
this.log('Checking backend availability...', 'info');
|
| 216 |
+
try {
|
| 217 |
+
const health = await this.callBackendApi('/');
|
| 218 |
+
this.log(`Backend Status: ${health.status} (LLM: ${health.llm_status}, CLIP: ${health.clip_status})`, 'success');
|
| 219 |
+
if (health.llm_status.includes("not loaded") || health.clip_status.includes("not loaded")) {
|
| 220 |
+
throw new Error("One or more AI models not loaded on backend. Check backend console.");
|
| 221 |
+
}
|
| 222 |
+
} catch (error) {
|
| 223 |
+
this.log(`Backend not available or unhealthy: ${error.message}. Please ensure your Python Flask backend is running.`, 'error');
|
| 224 |
+
alert(`Backend Error: ${error.message}. Please start the backend.`);
|
| 225 |
+
return; // Exit if backend is not healthy
|
| 226 |
+
}
|
| 227 |
+
// --- End Backend Health Check ---
|
| 228 |
+
|
| 229 |
+
this.log(`Initializing I2V pipeline for: "${prompt.substring(0, 30)}..."`, 'info');
|
| 230 |
+
|
| 231 |
+
// Phase 1: Initialization
|
| 232 |
+
await this.phaseInitialization();
|
| 233 |
+
|
| 234 |
+
// Phase 2: Quantum Circuit
|
| 235 |
+
await this.phaseQuantumCircuit();
|
| 236 |
+
|
| 237 |
+
// Phase 3: WebGPU Compute
|
| 238 |
+
await this.phaseWebGPU();
|
| 239 |
+
|
| 240 |
+
// Phase 4: Bridge & Diffusion (Real Emulation)
|
| 241 |
+
// This will now also handle recording if in Director Mode
|
| 242 |
+
await this.phaseRealDiffusion(prompt);
|
| 243 |
+
|
| 244 |
+
this.log('Generation Sequence Complete.', 'success');
|
| 245 |
+
document.getElementById('generation-stats').innerHTML = 'GENERATION COMPLETE';
|
| 246 |
+
|
| 247 |
+
// DIRECTOR MODE: PREP NEXT FRAME
|
| 248 |
+
if (this.directorMode && this.movieFrames.length > 0) {
|
| 249 |
+
this.prepareNextContext();
|
| 250 |
+
}
|
| 251 |
+
|
| 252 |
+
} catch (error) {
|
| 253 |
+
this.log(`System Error during generation: ${error.message}`, 'error');
|
| 254 |
+
document.getElementById('generation-stats').innerHTML = `ERROR: ${error.message}`;
|
| 255 |
+
console.error(error);
|
| 256 |
+
} finally {
|
| 257 |
+
// Re-enable UI elements
|
| 258 |
+
this.isGenerating = false;
|
| 259 |
+
document.getElementById('start-btn').disabled = false;
|
| 260 |
+
document.getElementById('prompt-input').disabled = false;
|
| 261 |
+
document.getElementById('image-input').disabled = false;
|
| 262 |
+
document.getElementById('quantum-influence').disabled = false;
|
| 263 |
+
document.getElementById('entanglement-depth').disabled = false;
|
| 264 |
+
document.getElementById('sampling-method').disabled = false;
|
| 265 |
+
}
|
| 266 |
+
}
|
| 267 |
+
|
| 268 |
+
prepareNextContext() {
|
| 269 |
+
// Get the last frame from the movie array
|
| 270 |
+
const lastFrameBitmap = this.movieFrames[this.movieFrames.length - 1];
|
| 271 |
+
|
| 272 |
+
// Create a temp canvas to extract the image
|
| 273 |
+
const canvas = document.createElement('canvas');
|
| 274 |
+
canvas.width = this.outputCanvas.width;
|
| 275 |
+
canvas.height = this.outputCanvas.height;
|
| 276 |
+
const ctx = canvas.getContext('2d');
|
| 277 |
+
ctx.drawImage(lastFrameBitmap, 0, 0);
|
| 278 |
+
|
| 279 |
+
// Convert to Image object for sourceImage
|
| 280 |
+
const newUrl = canvas.toDataURL();
|
| 281 |
+
const nextImg = new Image();
|
| 282 |
+
nextImg.onload = () => {
|
| 283 |
+
this.sourceImage = nextImg;
|
| 284 |
+
// Update Preview UI
|
| 285 |
+
const preview = document.getElementById('preview-img');
|
| 286 |
+
preview.src = newUrl;
|
| 287 |
+
this.log('Director Mode: Context refreshed. Last frame set as input for next sequence.', 'secondary');
|
| 288 |
+
};
|
| 289 |
+
nextImg.src = newUrl;
|
| 290 |
+
}
|
| 291 |
+
|
| 292 |
+
async sleep(ms) {
|
| 293 |
+
return new Promise(r => setTimeout(r, ms));
|
| 294 |
+
}
|
| 295 |
+
|
| 296 |
+
async phaseInitialization() {
|
| 297 |
+
this.log('Allocating WebGPU buffers for I2V tensor...', 'info');
|
| 298 |
+
await this.sleep(600);
|
| 299 |
+
this.log('Quantizing source image to 512-dim latent space...', 'info');
|
| 300 |
+
await this.sleep(800);
|
| 301 |
+
}
|
| 302 |
+
|
| 303 |
+
async phaseQuantumCircuit() {
|
| 304 |
+
this.log(`Constructing ${this.config.depth}-layer quantum circuit...`, 'info');
|
| 305 |
+
// Trigger Viz animation if available globally
|
| 306 |
+
if (window.circuitViz) window.circuitViz.updateVizParameters(this.config.influence, this.config.depth);
|
| 307 |
+
|
| 308 |
+
await this.sleep(1000);
|
| 309 |
+
this.log('Applying Hadamard gates to initialization layer...', 'info');
|
| 310 |
+
await this.sleep(400);
|
| 311 |
+
this.log(`Entangling qubits 0-511 with depth ${this.config.depth}...`, 'info');
|
| 312 |
+
await this.sleep(800);
|
| 313 |
+
}
|
| 314 |
+
|
| 315 |
+
async phaseWebGPU() {
|
| 316 |
+
this.log('Compiling circuit to WGSL shaders...', 'info');
|
| 317 |
+
await this.sleep(600);
|
| 318 |
+
this.log('Injecting quantum noise into CLIP embeddings...', 'info');
|
| 319 |
+
|
| 320 |
+
// Simulate intense calculation, trigger stateViz with parameters
|
| 321 |
+
if (window.stateViz) window.stateViz.updateVizParameters(this.config.influence, this.config.depth);
|
| 322 |
+
// Keep sleep for visual pacing
|
| 323 |
+
for (let i = 0; i < 5; i++) {
|
| 324 |
+
await this.sleep(200);
|
| 325 |
+
}
|
| 326 |
+
|
| 327 |
+
const entropy = (Math.random() * 3 + 0.5).toFixed(4);
|
| 328 |
+
document.getElementById('entropy-value').textContent = entropy;
|
| 329 |
+
this.log(`Latent perturbation complete. Entropy: ${entropy}`, 'success');
|
| 330 |
+
}
|
| 331 |
+
|
| 332 |
+
async phaseRealDiffusion(prompt) {
|
| 333 |
+
this.log('Starting Frame-by-Frame Quantum Diffusion...', 'warn');
|
| 334 |
+
|
| 335 |
+
// Switch tab to output to show the magic
|
| 336 |
+
document.querySelector('[data-view="output"]').click();
|
| 337 |
+
|
| 338 |
+
// Get initial image data from the source image
|
| 339 |
+
let currentImage = this.sourceImage;
|
| 340 |
+
const totalFrames = 48; // Total frames for the movie
|
| 341 |
+
let currentFrameDataURL = currentImage.src; // Data URL of the current frame
|
| 342 |
+
|
| 343 |
+
for (let frame = 0; frame < totalFrames; frame++) {
|
| 344 |
+
this.log(`Requesting guidance for Frame ${frame + 1}/${totalFrames}...`, 'info');
|
| 345 |
+
document.getElementById('generation-stats').innerHTML = `GETTING GUIDANCE FOR FRAME ${frame + 1}/${totalFrames}<br>Quantum-Diffusing...`;
|
| 346 |
+
|
| 347 |
+
// Call backend for LLM guidance on how to transform the current frame
|
| 348 |
+
const guidanceResponse = await this.callBackendApi('/generate_frame_guidance', {
|
| 349 |
+
image: currentFrameDataURL,
|
| 350 |
+
prompt: prompt,
|
| 351 |
+
influence: this.config.influence,
|
| 352 |
+
depth: this.config.depth,
|
| 353 |
+
frame_number: frame
|
| 354 |
+
});
|
| 355 |
+
|
| 356 |
+
const llmGuidance = guidanceResponse.guidance;
|
| 357 |
+
this.log(`LLM Guidance (Frame ${frame + 1}): ${llmGuidance.substring(0, 80)}...`, 'secondary');
|
| 358 |
+
|
| 359 |
+
document.getElementById('generation-stats').innerHTML = `RENDERING FRAME ${frame + 1}/${totalFrames}<br>Applying Quantum Effects...`;
|
| 360 |
+
|
| 361 |
+
// Render the next frame based on LLM guidance and current image
|
| 362 |
+
const newFrameDataURL = await this.renderFrameTransition(currentImage, this.config.influence, llmGuidance, frame);
|
| 363 |
+
|
| 364 |
+
// Update currentImage for the next iteration
|
| 365 |
+
currentImage = await this this.loadImageFromDataURL(newFrameDataURL);
|
| 366 |
+
currentFrameDataURL = newFrameDataURL; // Update dataURL as well
|
| 367 |
+
|
| 368 |
+
// Director Mode: Record Frame
|
| 369 |
+
if (this.directorMode) {
|
| 370 |
+
const bitmap = await createImageBitmap(this.outputCanvas);
|
| 371 |
+
this.movieFrames.push(bitmap);
|
| 372 |
+
this.updateDirectorUI();
|
| 373 |
+
}
|
| 374 |
+
|
| 375 |
+
await this.sleep(50); // Render speed
|
| 376 |
+
}
|
| 377 |
+
}
|
| 378 |
+
|
| 379 |
+
async loadImageFromDataURL(dataURL) {
|
| 380 |
+
return new Promise((resolve, reject) => {
|
| 381 |
+
const img = new Image();
|
| 382 |
+
img.onload = () => resolve(img);
|
| 383 |
+
img.onerror = reject;
|
| 384 |
+
img.src = dataURL;
|
| 385 |
+
});
|
| 386 |
+
}
|
| 387 |
+
|
| 388 |
+
async renderFrameTransition(currentImage, influence, llmGuidance, frameNumber) {
|
| 389 |
+
const w = this.outputCanvas.width;
|
| 390 |
+
const h = this.outputCanvas.height;
|
| 391 |
+
this.outputCtx.clearRect(0, 0, w, h); // Clear canvas for new frame
|
| 392 |
+
|
| 393 |
+
// Create a temporary canvas to draw the currentImage and apply effects
|
| 394 |
+
const tempCanvas = document.createElement('canvas');
|
| 395 |
+
tempCanvas.width = w;
|
| 396 |
+
tempCanvas.height = h;
|
| 397 |
+
const tempCtx = tempCanvas.getContext('2d');
|
| 398 |
+
|
| 399 |
+
// Draw the current image, scaled to fit
|
| 400 |
+
const aspectRatio = currentImage.width / currentImage.height;
|
| 401 |
+
let drawWidth = w;
|
| 402 |
+
let drawHeight = h;
|
| 403 |
+
if (w / h > aspectRatio) { // Canvas is wider than image
|
| 404 |
+
drawWidth = h * aspectRatio;
|
| 405 |
+
} else { // Canvas is taller than image
|
| 406 |
+
drawHeight = w / aspectRatio;
|
| 407 |
+
}
|
| 408 |
+
const offsetX = (w - drawWidth) / 2;
|
| 409 |
+
const offsetY = (h - drawHeight) / 2;
|
| 410 |
+
tempCtx.drawImage(currentImage, offsetX, offsetY, drawWidth, drawHeight);
|
| 411 |
+
|
| 412 |
+
// Get ImageData for pixel manipulation
|
| 413 |
+
let imageData = tempCtx.getImageData(0, 0, w, h);
|
| 414 |
+
let data = imageData.data;
|
| 415 |
+
|
| 416 |
+
// --- Parse LLM Guidance and apply effects ---
|
| 417 |
+
const instructions = llmGuidance.toLowerCase().split(',').map(s => s.trim());
|
| 418 |
+
let pixelShiftX = 0;
|
| 419 |
+
let pixelShiftY = 0;
|
| 420 |
+
let colorShiftR = 0;
|
| 421 |
+
let colorShiftG = 0;
|
| 422 |
+
let colorShiftB = 0;
|
| 423 |
+
let blurRadius = 0;
|
| 424 |
+
let zoomFactor = 1;
|
| 425 |
+
let staticOverlayOpacity = 0;
|
| 426 |
+
|
| 427 |
+
for (const instruction of instructions) {
|
| 428 |
+
if (instruction.includes("shift red by")) {
|
| 429 |
+
colorShiftR += parseInt(instruction.match(/by (-?\d+)/)?.[1] || "0");
|
| 430 |
+
} else if (instruction.includes("shift green by")) {
|
| 431 |
+
colorShiftG += parseInt(instruction.match(/by (-?\d+)/)?.[1] || "0");
|
| 432 |
+
} else if (instruction.includes("shift blue by")) {
|
| 433 |
+
colorShiftB += parseInt(instruction.match(/by (-?\d+)/)?.[1] || "0");
|
| 434 |
+
} else if (instruction.includes("pixel displacement x-axis")) {
|
| 435 |
+
pixelShiftX += parseInt(instruction.match(/random (-?\d+)px/)?.[1] || "0");
|
| 436 |
+
} else if (instruction.includes("pixel displacement y-axis")) {
|
| 437 |
+
pixelShiftY += parseInt(instruction.match(/random (-?\d+)px/)?.[1] || "0");
|
| 438 |
+
} else if (instruction.includes("apply gaussian blur radius")) {
|
| 439 |
+
blurRadius = Math.max(blurRadius, parseInt(instruction.match(/radius (\d+)/)?.[1] || "0"));
|
| 440 |
+
} else if (instruction.includes("zoom in")) {
|
| 441 |
+
zoomFactor *= (1 + parseFloat(instruction.match(/zoom in (\d+(\.\d+)?)/)?.[1] || "0"));
|
| 442 |
+
} else if (instruction.includes("zoom out")) {
|
| 443 |
+
zoomFactor /= (1 + parseFloat(instruction.match(/zoom out (\d+(\.\d+)?)/)?.[1] || "0"));
|
| 444 |
+
} else if (instruction.includes("static pattern opacity")) {
|
| 445 |
+
staticOverlayOpacity = Math.max(staticOverlayOpacity, parseFloat(instruction.match(/opacity (\d+(\.\d+)?)/)?.[1] || "0"));
|
| 446 |
+
}
|
| 447 |
+
// Add more parsing for other instructions...
|
| 448 |
+
}
|
| 449 |
+
|
| 450 |
+
// Apply pixel shifts and color changes
|
| 451 |
+
const tempImageData = tempCtx.createImageData(w, h);
|
| 452 |
+
const tempData = tempImageData.data;
|
| 453 |
+
|
| 454 |
+
for (let y = 0; y < h; y++) {
|
| 455 |
+
for (let x = 0; x < w; x++) {
|
| 456 |
+
const originalIndex = (y * w + x) * 4;
|
| 457 |
+
|
| 458 |
+
const shiftedX = (x - pixelShiftX + w) % w;
|
| 459 |
+
const shiftedY = (y - pixelShiftY + h) % h;
|
| 460 |
+
const shiftedIndex = (shiftedY * w + shiftedX) * 4;
|
| 461 |
+
|
| 462 |
+
if (shiftedIndex >= 0 && shiftedIndex < data.length) {
|
| 463 |
+
tempData[originalIndex] = Math.min(255, Math.max(0, data[shiftedIndex] + colorShiftR)); // Red
|
| 464 |
+
tempData[originalIndex + 1] = Math.min(255, Math.max(0, data[shiftedIndex + 1] + colorShiftG)); // Green
|
| 465 |
+
tempData[originalIndex + 2] = Math.min(255, Math.max(0, data[shiftedIndex + 2] + colorShiftB)); // Blue
|
| 466 |
+
tempData[originalIndex + 3] = data[shiftedIndex + 3]; // Alpha
|
| 467 |
+
} else {
|
| 468 |
+
// Handle out-of-bounds pixels (e.g., fill with black or transparent)
|
| 469 |
+
tempData[originalIndex] = 0;
|
| 470 |
+
tempData[originalIndex + 1] = 0;
|
| 471 |
+
tempData[originalIndex + 2] = 0;
|
| 472 |
+
tempData[originalIndex + 3] = 255;
|
| 473 |
+
}
|
| 474 |
+
}
|
| 475 |
+
}
|
| 476 |
+
imageData = tempImageData; // Update imageData with shifted pixels
|
| 477 |
+
|
| 478 |
+
// Apply blur (very basic box blur for performance, Gaussian is complex with pixel data)
|
| 479 |
+
if (blurRadius > 0) {
|
| 480 |
+
const blurredImageData = tempCtx.createImageData(w, h);
|
| 481 |
+
const blurredData = blurredImageData.data;
|
| 482 |
+
for (let y = 0; y < h; y++) {
|
| 483 |
+
for (let x = 0; x < w; x++) {
|
| 484 |
+
let rSum = 0, gSum = 0, bSum = 0, aSum = 0;
|
| 485 |
+
let count = 0;
|
| 486 |
+
for (let ky = -blurRadius; ky <= blurRadius; ky++) {
|
| 487 |
+
for (let kx = -blurRadius; kx <= blurRadius; kx++) {
|
| 488 |
+
const nx = x + kx;
|
| 489 |
+
const ny = y + ky;
|
| 490 |
+
if (nx >= 0 && nx < w && ny >= 0 && ny < h) {
|
| 491 |
+
const index = (ny * w + nx) * 4;
|
| 492 |
+
rSum += data[index];
|
| 493 |
+
gSum += data[index + 1];
|
| 494 |
+
bSum += data[index + 2];
|
| 495 |
+
aSum += data[index + 3];
|
| 496 |
+
count++;
|
| 497 |
+
}
|
| 498 |
+
}
|
| 499 |
+
}
|
| 500 |
+
const outputIndex = (y * w + x) * 4;
|
| 501 |
+
blurredData[outputIndex] = rSum / count;
|
| 502 |
+
blurredData[outputIndex + 1] = gSum / count;
|
| 503 |
+
blurredData[outputIndex + 2] = bSum / count;
|
| 504 |
+
blurredData[outputIndex + 3] = aSum / count;
|
| 505 |
+
}
|
| 506 |
+
}
|
| 507 |
+
imageData = blurredImageData;
|
| 508 |
+
}
|
| 509 |
+
|
| 510 |
+
// Apply static overlay
|
| 511 |
+
if (staticOverlayOpacity > 0) {
|
| 512 |
+
for (let i = 0; i < imageData.data.length; i += 4) {
|
| 513 |
+
const staticValue = Math.random() * 255;
|
| 514 |
+
imageData.data[i] = (imageData.data[i] * (1 - staticOverlayOpacity)) + (staticValue * staticOverlayOpacity);
|
| 515 |
+
imageData.data[i+1] = (imageData.data[i+1] * (1 - staticOverlayOpacity)) + (staticValue * staticOverlayOpacity);
|
| 516 |
+
imageData.data[i+2] = (imageData.data[i+2] * (1 - staticOverlayOpacity)) + (staticValue * staticOverlayOpacity);
|
| 517 |
+
}
|
| 518 |
+
}
|
| 519 |
+
|
| 520 |
+
// Draw the processed ImageData back to the temporary canvas
|
| 521 |
+
tempCtx.putImageData(imageData, 0, 0);
|
| 522 |
+
|
| 523 |
+
// Apply zoom (done by redrawing tempCanvas onto outputCanvas)
|
| 524 |
+
const zoomedWidth = w * zoomFactor;
|
| 525 |
+
const zoomedHeight = h * zoomFactor;
|
| 526 |
+
const zoomOffsetX = (w - zoomedWidth) / 2;
|
| 527 |
+
const zoomOffsetY = (h - zoomedHeight) / 2;
|
| 528 |
+
|
| 529 |
+
this.outputCtx.drawImage(tempCanvas, zoomOffsetX, zoomOffsetY, zoomedWidth, zoomedHeight);
|
| 530 |
+
|
| 531 |
+
// Periodically draw circuit overlay if influence is high
|
| 532 |
+
if (influence > 50 && frameNumber % 5 === 0) {
|
| 533 |
+
this.drawCircuitOverlay();
|
| 534 |
+
}
|
| 535 |
+
|
| 536 |
+
// Convert the final rendered canvas state to a DataURL for the next iteration
|
| 537 |
+
return this.outputCanvas.toDataURL();
|
| 538 |
+
}
|
| 539 |
+
|
| 540 |
+
drawCircuitOverlay() {
|
| 541 |
+
const ctx = this.outputCtx;
|
| 542 |
+
const w = this.outputCanvas.width;
|
| 543 |
+
const h = this.outputCanvas.height;
|
| 544 |
+
|
| 545 |
+
ctx.strokeStyle = 'rgba(0, 240, 255, 0.3)';
|
| 546 |
+
ctx.lineWidth = 1;
|
| 547 |
+
ctx.beginPath();
|
| 548 |
+
const y = Math.random() * h;
|
| 549 |
+
ctx.moveTo(0, y);
|
| 550 |
+
ctx.lineTo(w, y);
|
| 551 |
+
ctx.stroke();
|
| 552 |
+
|
| 553 |
+
ctx.fillStyle = 'rgba(0, 240, 255, 0.5)';
|
| 554 |
+
// Attempt to get a more dynamic font size
|
| 555 |
+
const fontSize = Math.max(10, Math.min(w, h) / 30);
|
| 556 |
+
ctx.font = `${fontSize}px Arial`;
|
| 557 |
+
ctx.fillText(`Q-GATE-${Math.floor(Math.random()*100)}`, 10, y - 5);
|
| 558 |
+
}
|
| 559 |
+
|
| 560 |
+
drawStaticNoise() {
|
| 561 |
+
const w = this.outputCanvas.width;
|
| 562 |
+
const h = this.outputCanvas.height;
|
| 563 |
+
const id = this.outputCtx.createImageData(w, h);
|
| 564 |
+
const d = id.data;
|
| 565 |
+
|
| 566 |
+
for (let i = 0; i < d.length; i += 4) {
|
| 567 |
+
const v = Math.random() * 20; // Dark noise
|
| 568 |
+
d[i] = v; d[i+1] = v; d[i+2] = v + 10; d[i+3] = 255;
|
| 569 |
+
}
|
| 570 |
+
this.outputCtx.putImageData(id, 0, 0);
|
| 571 |
+
}
|
| 572 |
+
|
| 573 |
+
// renderDiffusionStep is no longer needed as phaseRealDiffusion and renderFrameTransition handle it.
|
| 574 |
+
// Keeping it as a placeholder/commented out for now if previous functionality needs reference.
|
| 575 |
+
// async renderDiffusionStep(step, totalSteps) { }
|
| 576 |
+
|
| 577 |
+
async downloadMovie() {
|
| 578 |
+
if (this.movieFrames.length === 0) return;
|
| 579 |
+
|
| 580 |
+
const btn = document.getElementById('download-btn');
|
| 581 |
+
const originalText = btn.innerHTML;
|
| 582 |
+
btn.disabled = true;
|
| 583 |
+
btn.innerHTML = 'RENDER...';
|
| 584 |
+
|
| 585 |
+
this.log('Starting Movie Rendering...', 'info');
|
| 586 |
+
|
| 587 |
+
try {
|
| 588 |
+
// Create a hidden canvas for playback
|
| 589 |
+
const canvas = document.createElement('canvas');
|
| 590 |
+
            canvas.width = this.outputCanvas.width;
            canvas.height = this.outputCanvas.height;
            const ctx = canvas.getContext('2d');

            // Setup MediaRecorder
            const stream = canvas.captureStream(30); // 30 FPS
            const mimeType = MediaRecorder.isTypeSupported('video/webm;codecs=vp9')
                ? 'video/webm;codecs=vp9'
                : 'video/webm';

            const recorder = new MediaRecorder(stream, {
                mimeType: mimeType,
                videoBitsPerSecond: 5000000 // 5 Mbps
            });

            const chunks = [];
            recorder.ondataavailable = (e) => {
                if (e.data.size > 0) chunks.push(e.data);
            };

            recorder.onstop = () => {
                const blob = new Blob(chunks, { type: 'video/webm' });
                const url = URL.createObjectURL(blob);
                const a = document.createElement('a');
                a.href = url;
                a.download = `wan-quantum-director-cut-${Date.now()}.webm`;
                a.click();
                URL.revokeObjectURL(url);
                this.log('Movie downloaded successfully.', 'success');
                btn.innerHTML = originalText;
                btn.disabled = false;
            };

            recorder.start();

            // Draw frames into the recording canvas, pacing each draw to
            // roughly match the stream FPS so captureStream() picks it up.
            const frameDuration = 1000 / 30;

            for (const bitmap of this.movieFrames) {
                ctx.drawImage(bitmap, 0, 0);
                // Wait one frame interval so the capture stream samples this frame.
                await new Promise(r => setTimeout(r, frameDuration));
            }

            recorder.stop();

        } catch (e) {
            this.log(`Export failed: ${e.message}`, 'error');
            btn.innerHTML = originalText;
            btn.disabled = false;
        }
    }
styles.css
ADDED
@@ -0,0 +1,803 @@
@import url('https://fonts.googleapis.com/css2?family=Space+Mono:wght@400;700&family=Inter:wght@300;400;600;800&display=swap');

:root {
    --bg-dark: #050510;
    --bg-panel: rgba(20, 25, 40, 0.7);
    --accent: #00f0ff;
    --accent-glow: rgba(0, 240, 255, 0.3);
    --secondary: #7000ff;
    --text-main: #e0e6ed;
    --text-dim: #94a3b8;
    --border: rgba(255, 255, 255, 0.1);
    --glass: blur(12px);
    --font-mono: 'Space Mono', monospace;
    --font-sans: 'Inter', sans-serif;
}

* {
    box-sizing: border-box;
    margin: 0;
    padding: 0;
}

body {
    background-color: var(--bg-dark);
    color: var(--text-main);
    font-family: var(--font-sans);
    overflow: hidden; /* App-like feel */
    height: 100vh;
    background-image:
        radial-gradient(circle at 10% 20%, rgba(112, 0, 255, 0.1) 0%, transparent 20%),
        radial-gradient(circle at 90% 80%, rgba(0, 240, 255, 0.1) 0%, transparent 20%);
}

/* Layout */
.app-container {
    display: grid;
    grid-template-columns: 260px 1fr;
    height: 100vh;
}

/* Side Navigation */
.side-nav {
    background: rgba(10, 12, 20, 0.9);
    border-right: 1px solid var(--border);
    padding: 24px;
    display: flex;
    flex-direction: column;
    gap: 32px;
}

.nav-brand {
    display: flex;
    align-items: center;
    gap: 12px;
    font-weight: 800;
    font-size: 1.2rem;
    letter-spacing: 1px;
    color: var(--accent);
}

.brand-icon {
    width: 32px;
    height: 32px;
    background: var(--accent);
    color: #000;
    display: flex;
    align-items: center;
    justify-content: center;
    border-radius: 4px;
    box-shadow: 0 0 15px var(--accent-glow);
}

.nav-links {
    display: flex;
    flex-direction: column;
    gap: 8px;
}

.nav-link {
    display: flex;
    align-items: center;
    gap: 12px;
    padding: 12px;
    text-decoration: none;
    color: var(--text-dim);
    border-radius: 8px;
    transition: all 0.2s;
    font-size: 0.9rem;
}

.nav-link:hover, .nav-link.active {
    background: rgba(255, 255, 255, 0.05);
    color: var(--text-main);
}

.nav-link.active {
    border-left: 3px solid var(--accent);
    background: linear-gradient(90deg, rgba(0, 240, 255, 0.1), transparent);
}

.nav-status {
    margin-top: auto;
    border-top: 1px solid var(--border);
    padding-top: 24px;
}

.status-row {
    display: flex;
    justify-content: space-between;
    align-items: center;
    font-size: 0.8rem;
    color: var(--text-dim);
    margin-bottom: 12px;
}

.status-dot {
    width: 8px;
    height: 8px;
    border-radius: 50%;
    background: #333;
}

.status-dot.active {
    background: #00ff88;
    box-shadow: 0 0 8px #00ff88;
}

.mono {
    font-family: var(--font-mono);
    color: var(--accent);
}

/* Main Content */
.content-area {
    overflow-y: auto;
    padding: 32px;
    position: relative;
}

.section {
    display: none;
    max-width: 1200px;
    margin: 0 auto;
    animation: fadeIn 0.4s ease-out;
}

.section.active {
    display: block;
}

@keyframes fadeIn {
    from { opacity: 0; transform: translateY(10px); }
    to { opacity: 1; transform: translateY(0); }
}

/* Simulation Grid */
.simulation-grid {
    display: grid;
    grid-template-columns: 350px 1fr;
    grid-template-rows: auto 250px;
    gap: 24px;
    height: calc(100vh - 120px);
}

.glass-panel {
    background: var(--bg-panel);
    backdrop-filter: var(--glass);
    -webkit-backdrop-filter: var(--glass);
    border: 1px solid var(--border);
    border-radius: 12px;
    padding: 24px;
    box-shadow: 0 8px 32px rgba(0, 0, 0, 0.3);
    display: flex;
    flex-direction: column;
}

/* Inputs */
.control-panel {
    grid-row: 1 / -1;
}

.input-group {
    margin-bottom: 24px;
}

/* File Upload Styles */
.file-drop-zone {
    border: 2px dashed var(--border);
    border-radius: 8px;
    padding: 20px;
    text-align: center;
    background: rgba(0,0,0,0.2);
    cursor: pointer;
    transition: all 0.3s ease;
    position: relative;
    overflow: hidden;
    min-height: 120px;
    display: flex;
    align-items: center;
    justify-content: center;
}

.file-drop-zone:hover, .file-drop-zone.drag-over {
    border-color: var(--accent);
    background: rgba(0, 240, 255, 0.05);
}

.drop-content {
    display: flex;
    flex-direction: column;
    align-items: center;
    gap: 8px;
    pointer-events: none;
    z-index: 2;
}

.drop-content .icon {
    font-size: 2rem;
    margin-bottom: 4px;
}

.drop-text {
    font-weight: 700;
    color: var(--text-main);
    font-size: 0.9rem;
}

.drop-sub {
    color: var(--text-dim);
    font-size: 0.7rem;
}

#preview-img {
    position: absolute;
    top: 0;
    left: 0;
    width: 100%;
    height: 100%;
    object-fit: cover;
    opacity: 0.6;
    z-index: 1;
}

#preview-img.hidden {
    display: none;
}

label {
    display: flex;
    justify-content: space-between;
    margin-bottom: 8px;
    font-size: 0.8rem;
    text-transform: uppercase;
    letter-spacing: 1px;
    color: var(--text-dim);
    font-weight: 600;
}

textarea {
    width: 100%;
    height: 100px;
    background: rgba(0, 0, 0, 0.3);
    border: 1px solid var(--border);
    border-radius: 8px;
    padding: 12px;
    color: var(--text-main);
    font-family: var(--font-sans);
    resize: none;
    transition: border-color 0.2s;
}

textarea:focus {
    outline: none;
    border-color: var(--accent);
}

input[type="range"] {
    width: 100%;
    background: transparent;
    -webkit-appearance: none;
}

input[type="range"]::-webkit-slider-runnable-track {
    height: 4px;
    background: rgba(255, 255, 255, 0.1);
    border-radius: 2px;
}

input[type="range"]::-webkit-slider-thumb {
    -webkit-appearance: none;
    height: 16px;
    width: 16px;
    border-radius: 50%;
    background: var(--accent);
    margin-top: -6px;
    cursor: pointer;
    box-shadow: 0 0 10px var(--accent-glow);
}

.slider-meta {
    font-size: 0.7rem;
    color: var(--text-dim);
    margin-top: 4px;
}

.value-badge {
    color: var(--accent);
    font-family: var(--font-mono);
}

select {
    width: 100%;
    background: rgba(0, 0, 0, 0.3);
    border: 1px solid var(--border);
    color: var(--text-main);
    padding: 10px;
    border-radius: 6px;
}

.btn-primary {
    width: 100%;
    padding: 16px;
    background: var(--accent);
    color: #000;
    border: none;
    border-radius: 6px;
    font-weight: 800;
    cursor: pointer;
    position: relative;
    overflow: hidden;
    transition: all 0.2s;
}

.btn-primary:hover {
    background: #fff;
    box-shadow: 0 0 20px var(--accent-glow);
}

.btn-primary:disabled {
    background: #333;
    color: #666;
    cursor: not-allowed;
}

/* Director Mode Styles */
.director-controls {
    background: rgba(0, 0, 0, 0.3);
    border: 1px solid var(--secondary);
    border-radius: 8px;
    padding: 16px;
    margin-bottom: 24px;
}

.director-header {
    display: flex;
    justify-content: space-between;
    align-items: center;
    margin-bottom: 12px;
}

.switch-container {
    display: flex;
    align-items: center;
    gap: 12px;
    cursor: pointer;
}

.switch-container input {
    display: none;
}

.toggle-slider {
    width: 40px;
    height: 20px;
    background: #333;
    border-radius: 20px;
    position: relative;
    transition: all 0.3s;
}

.toggle-slider::before {
    content: '';
    position: absolute;
    width: 16px;
    height: 16px;
    background: #fff;
    border-radius: 50%;
    top: 2px;
    left: 2px;
    transition: all 0.3s;
}

input:checked + .toggle-slider {
    background: var(--secondary);
}

input:checked + .toggle-slider::before {
    transform: translateX(20px);
}

.frame-count {
    font-family: var(--font-mono);
    color: var(--secondary);
    font-size: 0.8rem;
}

.director-actions {
    display: flex;
    gap: 12px;
}

.btn-secondary, .btn-danger {
    flex: 1;
    padding: 10px;
    border: none;
    border-radius: 4px;
    font-size: 0.8rem;
    font-weight: 700;
    cursor: pointer;
    display: flex;
    align-items: center;
    justify-content: center;
    gap: 8px;
    transition: all 0.2s;
}

.btn-secondary {
    background: rgba(255, 255, 255, 0.1);
    color: var(--text-main);
    border: 1px solid var(--border);
}

.btn-secondary:hover:not(:disabled) {
    background: var(--text-main);
    color: #000;
}

.btn-danger {
    background: rgba(255, 0, 85, 0.1);
    color: #ff0055;
    border: 1px solid rgba(255, 0, 85, 0.3);
}

.btn-danger:hover:not(:disabled) {
    background: #ff0055;
    color: #fff;
}

.btn-secondary:disabled, .btn-danger:disabled {
    opacity: 0.5;
    cursor: not-allowed;
}

/* Visualization Panel */
.visualization-panel {
    grid-column: 2;
    grid-row: 1;
    padding: 0; /* Custom padding for tabs */
    overflow: hidden;
}

.viz-tabs {
    display: flex;
    background: rgba(0, 0, 0, 0.2);
    border-bottom: 1px solid var(--border);
}

.viz-tab {
    padding: 12px 24px;
    background: transparent;
    border: none;
    color: var(--text-dim);
    cursor: pointer;
    font-family: var(--font-mono);
    font-size: 0.8rem;
    border-right: 1px solid var(--border);
}

.viz-tab.active {
    color: var(--accent);
    background: rgba(0, 240, 255, 0.05);
}

.viz-content {
    flex: 1;
    position: relative;
    background: #000;
}

.viz-view {
    display: none;
    width: 100%;
    height: 100%;
}

.viz-view.active {
    display: block;
}

canvas {
    width: 100%;
    height: 100%;
    display: block;
}

.overlay-stats {
    position: absolute;
    top: 50%;
    left: 50%;
    transform: translate(-50%, -50%);
    font-family: var(--font-mono);
    color: var(--accent);
    background: rgba(0, 0, 0, 0.8);
    padding: 8px 16px;
    border: 1px solid var(--accent);
    pointer-events: none;
}

.scanline {
    position: absolute;
    top: 0;
    left: 0;
    width: 100%;
    height: 4px;
    background: rgba(0, 240, 255, 0.3);
    opacity: 0.3;
    animation: scan 3s linear infinite;
    pointer-events: none;
}

@keyframes scan {
    0% { top: 0%; }
    100% { top: 100%; }
}

.entropy-readout {
    position: absolute;
    bottom: 16px;
    right: 16px;
    font-family: var(--font-mono);
    background: rgba(0,0,0,0.6);
    padding: 4px 8px;
    border-radius: 4px;
    color: var(--secondary);
}

/* Terminal */
.terminal-panel {
    grid-column: 2;
    grid-row: 2;
    font-family: var(--font-mono);
    font-size: 0.85rem;
    padding: 0;
    background: #0a0a12;
}

.terminal-header {
    padding: 8px 16px;
    background: rgba(255, 255, 255, 0.05);
    border-bottom: 1px solid var(--border);
    display: flex;
    justify-content: space-between;
    color: var(--text-dim);
    font-size: 0.7rem;
    text-transform: uppercase;
}

.terminal-status {
    color: #00ff88;
}

.terminal-body {
    padding: 16px;
    overflow-y: auto;
    color: #d0d0d0;
    height: 100%;
    display: flex;
    flex-direction: column;
    gap: 4px;
}

.log-line {
    opacity: 0.8;
    border-left: 2px solid transparent;
    padding-left: 8px;
}

.log-line.info { border-color: var(--accent); }
.log-line.warn { border-color: #ffcc00; color: #ffcc00; }
.log-line.error { border-color: #ff0055; color: #ff0055; }
.log-line.success { border-color: #00ff88; color: #00ff88; }

.ts {
    color: #666;
    margin-right: 8px;
}

/* Document Styles */
.document-wrapper {
    max-width: 900px;
    margin: 0 auto;
    line-height: 1.8;
}

.document-wrapper h1 {
    font-size: 2.5rem;
    margin-bottom: 16px;
    background: linear-gradient(90deg, #fff, var(--text-dim));
    -webkit-background-clip: text;
    -webkit-text-fill-color: transparent;
}

.lead {
    font-size: 1.1rem;
    color: var(--text-dim);
    margin-bottom: 32px;
    border-left: 4px solid var(--accent);
    padding-left: 16px;
}

h3 {
    margin: 32px 0 16px;
    color: var(--accent);
    font-family: var(--font-mono);
    text-transform: uppercase;
    font-size: 1rem;
    letter-spacing: 1px;
}

p {
    margin-bottom: 16px;
    color: #c0c0c0;
}

.metric-cards {
    display: grid;
    grid-template-columns: repeat(3, 1fr);
    gap: 16px;
    margin: 32px 0;
}

.card {
    background: rgba(255, 255, 255, 0.05);
    padding: 24px;
    border-radius: 8px;
    text-align: center;
    border: 1px solid var(--border);
}

.metric-val {
    font-size: 2rem;
    font-weight: 800;
    color: #fff;
    margin-bottom: 4px;
}

.metric-label {
    font-size: 0.7rem;
    text-transform: uppercase;
    color: var(--text-dim);
}

.specs-table {
    width: 100%;
    border: 1px solid var(--border);
    border-radius: 8px;
    overflow: hidden;
    margin: 24px 0;
}

.spec-row {
    display: grid;
    grid-template-columns: 1fr 1fr 2fr 1fr;
    padding: 12px 16px;
    border-bottom: 1px solid var(--border);
}

.spec-row:last-child {
    border-bottom: none;
}

.spec-row.header {
    background: rgba(255, 255, 255, 0.05);
    font-weight: bold;
    color: var(--accent);
    font-family: var(--font-mono);
    font-size: 0.8rem;
}

.spec-row span {
    font-size: 0.9rem;
}

/* Timeline */
.timeline {
    position: relative;
    margin: 32px 0;
    padding-left: 32px;
}

.timeline::before {
    content: '';
    position: absolute;
    left: 0;
    top: 0;
    bottom: 0;
    width: 2px;
    background: var(--border);
}

.timeline-item {
    position: relative;
    margin-bottom: 24px;
}

.timeline-item::before {
    content: '';
    position: absolute;
    left: -37px;
    top: 6px;
    width: 12px;
    height: 12px;
    background: var(--bg-dark);
    border: 2px solid var(--accent);
    border-radius: 50%;
}

.phase {
    font-family: var(--font-mono);
    color: var(--accent);
    font-size: 0.8rem;
    margin-bottom: 4px;
}

/* Architecture specific */
.architecture-grid {
    display: grid;
    grid-template-columns: 1fr 1fr;
    gap: 24px;
    margin-top: 32px;
}

.arch-card {
    background: rgba(255,255,255,0.03);
    padding: 24px;
    border-radius: 8px;
    border: 1px solid var(--border);
}

.arch-card h3 {
    margin-top: 0;
    font-size: 0.9rem;
}

.arch-card ul {
    list-style: none;
    padding: 0;
}

.arch-card li {
    margin-bottom: 8px;
    font-size: 0.85rem;
    padding-left: 16px;
    position: relative;
}

.arch-card li::before {
    content: '›';
    position: absolute;
    left: 0;
    color: var(--accent);
}

#architecture-canvas {
    height: 300px;
    width: 100%;
    background: #0f111a;
    border-radius: 8px;
    margin: 24px 0;
}

@media (max-width: 1024px) {
    .app-container {
        grid-template-columns: 1fr;
    }
    .side-nav {
        display: none;
    }
    .simulation-grid {
        grid-template-columns: 1fr;
        grid-template-rows: auto auto auto;
    }
    .visualization-panel {
        grid-column: 1;
        grid-row: 2;
        height: 400px;
    }
    .terminal-panel {
        grid-column: 1;
        grid-row: 3;
    }
}