Spaces: Runtime error
Commit 98129a3 · Parent(s): initial commit
Changed files:

- .gitattributes +1 -0
- .gitingore +1 -0
- Procfile +1 -0
- README.md +134 -0
- __pycache__/helper.cpython-313.pyc +0 -0
- __pycache__/main.cpython-313.pyc +0 -0
- helper.py +138 -0
- main.py +56 -0
- netG_A2B_epoch130.pth +3 -0
- netG_B2A_epoch130.pth +3 -0
- requirements.txt +9 -0
- static/script.js +128 -0
- static/style.css +20 -0
- templates/index.html +775 -0
.gitattributes
ADDED
@@ -0,0 +1 @@
*.pth filter=lfs diff=lfs merge=lfs -text
.gitingore
ADDED
@@ -0,0 +1 @@
/venv
Procfile
ADDED
@@ -0,0 +1 @@
web: uvicorn main:app --host 0.0.0.0 --port $PORT
README.md
ADDED
@@ -0,0 +1,134 @@

# FaceAging AI — Realistic Face Aging and De-Aging with AI

**FaceAging AI** is an AI-powered web application that transforms face images to appear older or younger with realistic results. It leverages deep learning models for face detection and age transformation, offering an intuitive interface for users to upload images and see instant aged or de-aged outputs.

---

## Features

* **Face Aging & De-Aging**: Convert young faces to old and vice versa with high visual fidelity.
* **Automatic Face Detection**: Detects faces in uploaded images using OpenCV so that only valid faces are processed.
* **Base64 Image Encoding**: Returns transformed images efficiently encoded for seamless frontend display.
* **FastAPI Backend**: Robust and scalable backend API handling image processing and AI inference.
* **Simple, Responsive UI**: User-friendly frontend using HTML, CSS, JavaScript, and Jinja2 templates.
* **CORS Enabled**: Allows cross-origin requests for flexible frontend-backend integration.

---

## Tech Stack

* **Backend**: FastAPI (Python)
* **Frontend**: HTML, CSS, JavaScript, Jinja2 templates
* **AI & Image Processing**: OpenCV, Pillow, NumPy, custom face-aging models
* **Deployment**: Cloud-ready (Render, Heroku, or any ASGI-compatible platform)

---

## Project Structure

```
faceaging-ai/
├── main.py              # FastAPI app entry point with endpoints
├── helper.py            # AI face aging helper functions & models
├── static/              # Static files (CSS, JS, images)
├── templates/           # HTML templates (Jinja2)
├── requirements.txt     # Python dependencies
├── README.md            # Project documentation (this file)
└── Procfile             # For deployment (if using Heroku/Render)
```

---

## Setup and Installation

### Prerequisites

* Python 3.8+
* A virtual environment tool (venv or conda)
* FastAPI, Uvicorn, OpenCV, Pillow, NumPy (see requirements.txt)

### Steps

1. **Clone the repository**

   ```bash
   git clone https://github.com/parthmax2/faceaging-ai.git
   cd faceaging-ai
   ```

2. **Create and activate a virtual environment**

   ```bash
   python -m venv venv
   source venv/bin/activate  # Windows: venv\Scripts\activate
   ```

3. **Install dependencies**

   ```bash
   pip install -r requirements.txt
   ```

4. **Run the FastAPI development server**

   ```bash
   uvicorn main:app --reload
   ```

5. **Access the app**

   Open your browser and navigate to:

   ```
   http://127.0.0.1:8000
   ```

   Upload a face image, select the conversion type (Young to Old or Old to Young), and click Generate to see the transformed image.

---

## API Endpoints

* `GET /` — Serves the main web interface.
* `POST /convert/` — Accepts an image file and a conversion type, and returns the aged or de-aged image as a base64 string.

---

## Deployment

You can deploy this FastAPI app on any ASGI-compatible platform:

* **Render:** Easy cloud deployment with automatic Dockerfile or Python environment detection.
* **Heroku:** Use the provided `Procfile` and `requirements.txt`.
* **Other platforms:** Ensure support for Python 3.8+, ASGI, and WebSockets if needed.

---

## Contribution

Contributions are welcome! Please open issues or submit pull requests for:

* Improving model accuracy
* Enhancing UI/UX
* Adding new features or endpoints
* Optimizing performance

---

## License

MIT License — free to use, modify, and distribute.

---

## Contact

**Saksham Pathak**
Master’s in Artificial Intelligence & Machine Learning, IIIT Lucknow
[GitHub](https://github.com/parthmax2) | [LinkedIn](https://linkedin.com/in/sakshampathak) | [Instagram](https://instagram.com/parthmax_)

---

*FaceAging AI © Saksham Pathak. Powered by open-source AI and computer vision technologies.*
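As a quick illustration of the `POST /convert/` endpoint documented above, here is a minimal client sketch. The URL, form-field names, and response shape come from the README and `main.py`; `save_result` and `face.jpg` are hypothetical names introduced only for this example.

```python
import base64

def save_result(payload: dict, out_path: str = "result.png") -> str:
    """Decode the JSON payload returned by POST /convert/ and write the PNG.

    The endpoint returns either {"image": "<base64>"} or {"error": "..."}.
    """
    if "image" not in payload:
        raise RuntimeError(payload.get("error", "unknown error"))
    with open(out_path, "wb") as f:
        f.write(base64.b64decode(payload["image"]))
    return out_path

# Assuming the dev server from step 4 is running locally:
# import requests
# resp = requests.post(
#     "http://127.0.0.1:8000/convert/",
#     files={"file": open("face.jpg", "rb")},
#     data={"conversion": "young_to_old"},
# )
# save_result(resp.json())
```

The actual request is left commented out because it needs a running server; the decoding helper alone shows how the base64 payload maps back to image bytes.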
__pycache__/helper.cpython-313.pyc
ADDED
Binary file (7.66 kB)

__pycache__/main.cpython-313.pyc
ADDED
Binary file (2.89 kB)
helper.py
ADDED
@@ -0,0 +1,138 @@
import torch
import torch.nn as nn
import numpy as np
import cv2
from torchvision import transforms
from PIL import Image

input_nc = 3
output_nc = 3

# Squeeze-and-Excitation Block
class SEBlock(nn.Module):
    def __init__(self, channel, reduction=16):
        super(SEBlock, self).__init__()
        self.fc = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channel, channel // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channel // reduction, channel, 1),
            nn.Sigmoid()
        )

    def forward(self, x):
        weights = self.fc(x)
        return x * weights

# Residual Block with SE
class ResnetBlock(nn.Module):
    def __init__(self, dim, reduction=16):
        super(ResnetBlock, self).__init__()
        self.conv_block = self.build_conv_block(dim)
        self.se = SEBlock(dim, reduction)

    def build_conv_block(self, dim):
        conv_block = [
            nn.ReflectionPad2d(1),
            nn.Conv2d(dim, dim, kernel_size=3, padding=0),
            nn.InstanceNorm2d(dim),
            nn.ReLU(True),
            nn.ReflectionPad2d(1),
            nn.Conv2d(dim, dim, kernel_size=3, padding=0),
            nn.InstanceNorm2d(dim)
        ]
        return nn.Sequential(*conv_block)

    def forward(self, x):
        out = self.conv_block(x)
        out = self.se(out)
        return x + out

# Generator Network
class GeneratorResNet(nn.Module):
    def __init__(self, input_nc, output_nc, n_residual_blocks=9):
        super(GeneratorResNet, self).__init__()

        # Initial convolution block
        model = [
            nn.ReflectionPad2d(3),
            nn.Conv2d(input_nc, 64, 7),
            nn.InstanceNorm2d(64),
            nn.ReLU(inplace=True)
        ]

        # Downsampling
        in_features = 64
        out_features = in_features * 2
        for _ in range(2):
            model += [
                nn.Conv2d(in_features, out_features, 3, stride=2, padding=1),
                nn.InstanceNorm2d(out_features),
                nn.ReLU(inplace=True)
            ]
            in_features = out_features
            out_features = in_features * 2

        # Residual blocks
        for _ in range(n_residual_blocks):
            model += [ResnetBlock(in_features)]

        # Upsampling
        out_features = in_features // 2
        for _ in range(2):
            model += [
                nn.ConvTranspose2d(in_features, out_features, 3, stride=2, padding=1, output_padding=1),
                nn.InstanceNorm2d(out_features),
                nn.ReLU(inplace=True)
            ]
            in_features = out_features
            out_features = in_features // 2

        # Output layer
        model += [
            nn.ReflectionPad2d(3),
            nn.Conv2d(64, output_nc, 7),
            nn.Tanh()
        ]

        self.model = nn.Sequential(*model)

    def forward(self, x):
        return self.model(x)

# Instantiate models
netG_A2B = GeneratorResNet(input_nc, output_nc)
netG_B2A = GeneratorResNet(input_nc, output_nc)

# Load model weights
device = 'cpu'
netG_A2B.load_state_dict(torch.load('./netG_A2B_epoch130.pth', map_location=device))
netG_B2A.load_state_dict(torch.load('./netG_B2A_epoch130.pth', map_location=device))

# Image transformation functions
def generate_Y2O(uploaded_image):
    to_tensor = transforms.ToTensor()
    tensor = to_tensor(uploaded_image).unsqueeze(0)  # Add batch dimension
    old = netG_A2B(tensor)
    return (old.squeeze().detach().permute(1, 2, 0).numpy() + 1) / 2

def generate_O2Y(uploaded_image):
    img = cv2.resize(uploaded_image, (256, 256))
    to_tensor = transforms.ToTensor()
    tensor = to_tensor(img).unsqueeze(0)  # Add batch dimension
    young = netG_B2A(tensor)
    return (young.squeeze().detach().permute(1, 2, 0).numpy() + 1) / 2

# Face detection using OpenCV
def extract_faces_opencv(image):
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    face_crops = []
    for (x, y, w, h) in faces:
        y1, y2 = max(0, y - 50), min(image.shape[0], y + h)
        x1, x2 = max(0, x - 30), min(image.shape[1], x + w + 30)
        face_crop = image[y1:y2, x1:x2]
        face_crops.append(face_crop)
    return face_crops
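The crop expansion in `extract_faces_opencv` pads each detected box by 50 px above and 30 px on either side, clamped to the image bounds. A standalone sketch of that arithmetic, using a hypothetical detection near the top-left corner of the image:

```python
# Hypothetical detection box (x, y, w, h) in a 300x400 (H x W) image
x, y, w, h = 10, 20, 100, 120
H, W = 300, 400

# Same clamping as extract_faces_opencv: extend the crop 50 px upward and
# 30 px sideways, but never past the image edges
y1, y2 = max(0, y - 50), min(H, y + h)
x1, x2 = max(0, x - 30), min(W, x + w + 30)
# Here both y - 50 and x - 30 go negative, so the crop is clamped to 0
```

The upward padding gives the generator some forehead and hair context, which a tight Haar-cascade face box would otherwise cut off.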
main.py
ADDED
@@ -0,0 +1,56 @@
from fastapi import FastAPI, File, UploadFile, Form
from fastapi.responses import HTMLResponse, FileResponse
from fastapi.staticfiles import StaticFiles
from fastapi.templating import Jinja2Templates
from fastapi.middleware.cors import CORSMiddleware
from fastapi.requests import Request
from helper import *
from PIL import Image
import numpy as np
import io
import cv2
import base64

app = FastAPI()

# CORS for JS frontend
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_methods=["*"],
    allow_headers=["*"]
)

app.mount("/static", StaticFiles(directory="static"), name="static")
templates = Jinja2Templates(directory="templates")

@app.get("/", response_class=HTMLResponse)
def root(request: Request):
    return templates.TemplateResponse("index.html", {"request": request})


@app.post("/convert/")
async def convert_image(file: UploadFile = File(...), conversion: str = Form(...)):
    contents = await file.read()
    image = Image.open(io.BytesIO(contents)).convert("RGB")
    image_np = np.array(image)

    faces = extract_faces_opencv(image_np)
    if not faces:
        return {"error": "No face detected"}

    face = cv2.resize(faces[0], (256, 256))  # process only 1 face for now

    if conversion == "young_to_old":
        result = generate_Y2O(face)
    elif conversion == "old_to_young":
        result = generate_O2Y(face)
    else:
        return {"error": "Invalid conversion type"}

    # Convert to base64
    result_img = (result * 255).astype(np.uint8)
    _, buffer = cv2.imencode(".png", result_img[:, :, ::-1])  # RGB to BGR for OpenCV encoding
    base64_img = base64.b64encode(buffer).decode("utf-8")

    return {"image": base64_img}
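The generators end in `nn.Tanh()`, so their raw output lies in [-1, 1]; `generate_Y2O`/`generate_O2Y` shift it into [0, 1], and the endpoint above then scales to 8-bit pixel values before PNG encoding. A small numeric sketch of that two-step denormalization at the extremes and midpoint:

```python
import numpy as np

# Representative tanh outputs: minimum, midpoint, maximum
fake = np.array([-1.0, 0.0, 1.0])

# helper.py: shift/scale from [-1, 1] to [0, 1]
unit = (fake + 1) / 2

# main.py: scale to [0, 255] and truncate to uint8 for cv2.imencode
pixels = (unit * 255).astype(np.uint8)
# -> [0, 127, 255] (0.5 * 255 = 127.5 truncates to 127)
```

Note that `astype(np.uint8)` truncates rather than rounds, which costs at most one gray level per pixel.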
netG_A2B_epoch130.pth
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:992c0132e46cf2294651f9d5c827a9c28ddb101d9531cf232a46792da0a82eed
size 45851722

netG_B2A_epoch130.pth
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:f8299523a9f28f7d28e5ec6b4431c847f7a216b4c4a33daf474e3f70b7ea204f
size 45851722
requirements.txt
ADDED
@@ -0,0 +1,9 @@
fastapi
uvicorn
python-multipart
jinja2
pillow
opencv-python
torch
numpy
torchvision
static/script.js
ADDED
@@ -0,0 +1,128 @@
// ===== Form Submission with Backend Integration =====
document.getElementById("uploadForm").addEventListener("submit", async function (e) {
  e.preventDefault();

  const fileInput = document.getElementById("fileInput");
  const conversionSelect = document.getElementById("conversion");
  const outputImage = document.getElementById("outputImage");
  const resultDiv = document.getElementById("result");
  const generateBtn = document.getElementById("generateBtn");

  if (!fileInput.files.length) return;

  const file = fileInput.files[0];
  const conversion = conversionSelect.value;

  generateBtn.disabled = true;
  generateBtn.innerHTML = '<i class="fas fa-spinner fa-spin mr-2"></i>Processing...';

  const formData = new FormData();
  formData.append("file", file);
  formData.append("conversion", conversion);

  try {
    const response = await fetch("/convert/", {
      method: "POST",
      body: formData,
    });

    const data = await response.json();

    if (data.image) {
      outputImage.src = `data:image/png;base64,${data.image}`;
      outputImage.alt = `Face image after ${conversion === "young_to_old" ? "aging" : "de-aging"} transformation`;
      outputImage.style.display = "block";
      resultDiv.scrollIntoView({ behavior: "smooth" });
    } else {
      alert(data.error || "An error occurred.");
    }
  } catch (error) {
    console.error("Error:", error);
    alert("Something went wrong. Please try again later.");
  } finally {
    generateBtn.disabled = false;
    generateBtn.innerHTML = "Generate";
  }
});


// ===== Dark Mode Toggle =====
const darkToggle = document.getElementById("darkToggle");
const body = document.body;
const darkIcon = darkToggle.querySelector("i");

function setDarkMode(enabled) {
  if (enabled) {
    body.classList.add("dark");
    darkIcon.classList.replace("fa-moon", "fa-sun");
    darkToggle.setAttribute("aria-label", "Toggle light mode");
    darkToggle.setAttribute("title", "Toggle light mode");
  } else {
    body.classList.remove("dark");
    darkIcon.classList.replace("fa-sun", "fa-moon");
    darkToggle.setAttribute("aria-label", "Toggle dark mode");
    darkToggle.setAttribute("title", "Toggle dark mode");
  }
  localStorage.setItem("faceAgingDarkMode", enabled ? "true" : "false");
}


// ===== Mobile Menu Toggle =====
const mobileMenuButton = document.getElementById("mobileMenuButton");
const mobileMenu = document.getElementById("mobileMenu");

mobileMenuButton.addEventListener("click", () => {
  const expanded = mobileMenuButton.getAttribute("aria-expanded") === "true";
  mobileMenuButton.setAttribute("aria-expanded", !expanded);
  mobileMenu.classList.toggle("hidden");
});


// ===== Header Shadow on Scroll =====
const header = document.getElementById("header");
window.addEventListener("scroll", () => {
  if (window.scrollY > 10) {
    header.classList.add("scrolled");
  } else {
    header.classList.remove("scrolled");
  }
});


// ===== Scroll Animations =====
const scrollElements = document.querySelectorAll(".scroll-animate, .fade-in");
const scrollObserver = new IntersectionObserver(
  (entries) => {
    entries.forEach((entry) => {
      if (entry.isIntersecting) {
        entry.target.classList.add("visible");
        scrollObserver.unobserve(entry.target);
      }
    });
  },
  { threshold: 0.15 }
);
scrollElements.forEach((el) => scrollObserver.observe(el));


// ===== Learn More Toggle =====
const learnMoreBtn = document.getElementById("learnMoreBtn");
const learnMoreContent = document.getElementById("learnMoreContent");

learnMoreBtn.addEventListener("click", () => {
  const isOpen = learnMoreContent.classList.toggle("open");
  learnMoreContent.hidden = !isOpen;
  learnMoreBtn.setAttribute("aria-expanded", isOpen);
  learnMoreBtn.textContent = isOpen ? "Show Less" : "Learn More";
});


// ===== Hero Upload Button Triggers File Picker =====
const uploadBtnHero = document.getElementById("uploadBtnHero");
const fileInput = document.getElementById("fileInput");

uploadBtnHero.addEventListener("click", () => {
  fileInput.click();
});
static/style.css
ADDED
@@ -0,0 +1,20 @@
body {
  font-family: Arial, sans-serif;
  background: #111;
  color: #eee;
  text-align: center;
  padding: 40px;
}

.container {
  background: #222;
  padding: 30px;
  border-radius: 10px;
  display: inline-block;
}

input, select, button {
  margin: 10px;
  padding: 10px;
  font-size: 1em;
}
templates/index.html
ADDED
@@ -0,0 +1,775 @@
<html class="scroll-smooth" lang="en">

<head>
  <meta charset="utf-8" />
  <meta content="width=device-width, initial-scale=1" name="viewport" />
  <title>FaceAging AI - AI-Powered Face Age Transformation</title>
  <script src="https://cdn.tailwindcss.com"></script>
  <link href="https://fonts.googleapis.com/css2?family=Inter:wght@400;600;700&display=swap" rel="stylesheet" />
  <link href="https://cdnjs.cloudflare.com/ajax/libs/font-awesome/5.15.3/css/all.min.css" rel="stylesheet" />
  <style>
    html {
      scroll-behavior: smooth;
    }

    body {
      font-family: "Inter", sans-serif;
      background-color: #faf9f6;
      color: #3e2723;
      transition: background-color 0.4s ease, color 0.4s ease;
      min-height: 100vh;
    }

    body.dark {
      background-color: #6d4c41;
      color: #d7ccc8;
    }

    ::-webkit-scrollbar {
      width: 8px;
      height: 8px;
    }

    ::-webkit-scrollbar-track {
      background: transparent;
    }

    ::-webkit-scrollbar-thumb {
      background-color: #a1887f;
      border-radius: 4px;
    }

    body.dark ::-webkit-scrollbar-thumb {
      background-color: #5d4037;
    }

    header.scrolled {
      box-shadow: 0 2px 8px rgb(0 0 0 / 0.15);
      backdrop-filter: saturate(180%) blur(10px);
      background-color: rgba(250, 249, 246, 0.85);
      transition: background-color 0.3s ease, box-shadow 0.3s ease;
    }

    body.dark header.scrolled {
      background-color: rgba(109, 76, 65, 0.9);
      box-shadow: 0 2px 12px rgb(0 0 0 / 0.5);
    }

    #darkToggle {
      cursor: pointer;
      transition: transform 0.3s ease, color 0.3s ease;
      color: #3e2723;
    }

    #darkToggle:hover,
    #darkToggle:focus {
      transform: scale(1.1);
      outline: none;
      color: #ff7043;
    }

    body.dark #darkToggle {
      color: #d7ccc8;
    }

    body.dark #darkToggle:hover,
    body.dark #darkToggle:focus {
      color: #ffccbc;
    }

    .fade-in {
      opacity: 0;
      transform: translateY(20px);
      transition: opacity 0.8s ease, transform 0.8s ease;
    }

    .fade-in.visible {
      opacity: 1;
      transform: translateY(0);
    }

    .scroll-animate {
      opacity: 0;
      transform: translateY(30px);
      transition: opacity 0.8s ease, transform 0.8s ease;
    }

    .scroll-animate.visible {
      opacity: 1;
      transform: translateY(0);
    }

    button.primary-btn {
      background-color: #6d4c41;
      color: #faf9f6;
      box-shadow: 0 4px 8px rgb(109 76 65 / 0.5);
      transition: background-color 0.3s ease, box-shadow 0.3s ease,
        transform 0.15s ease, color 0.3s ease;
      border-radius: 0.5rem;
      font-weight: 600;
    }

    button.primary-btn:hover,
    button.primary-btn:focus {
      background-color: #4e342e;
      box-shadow: 0 6px 12px rgb(78 52 46 / 0.7);
      outline: none;
      transform: scale(1.05);
      color: #ffccbc;
    }

    .feature-card,
    .step-card {
      background-color: #ffccbc;
      border-radius: 1rem;
      box-shadow: 0 2px 6px rgb(0 0 0 / 0.1);
      transition: box-shadow 0.3s ease, transform 0.3s ease,
        background-color 0.3s ease;
      cursor: default;
      color: #4e342e;
    }

    .feature-card:hover,
    .feature-card:focus-within,
    .step-card:hover,
    .step-card:focus-within {
      box-shadow: 0 8px 20px rgb(109 76 65 / 0.3);
      transform: translateY(-6px);
      outline: none;
      background-color: #ffab91;
      color: #3e2723;
    }

    .image-container {
      border-radius: 1rem;
      box-shadow: 0 8px 20px rgb(0 0 0 / 0.1);
      overflow: hidden;
      background-color: #fff7f3;
      position: relative;
    }

    .image-container img {
      opacity: 0;
      transition: opacity 1s ease;
      display: block;
      width: 100%;
      height: auto;
      user-select: none;
      pointer-events: none;
      border-radius: 1rem;
    }

    .image-container img.loaded {
      opacity: 1;
    }

    .spinner {
      border: 4px solid #e0d7d1;
      border-top: 4px solid #6d4c41;
      border-radius: 50%;
      width: 48px;
      height: 48px;
      animation: spin 1s linear infinite;
      margin: auto;
    }

    @keyframes spin {
      0% {
        transform: rotate(0deg);
      }

      100% {
        transform: rotate(360deg);
      }
    }

    .upload-area {
      border: 2px dashed #d7ccc8;
      border-radius: 1rem;
      background-color: #fff7f3;
      transition: background-color 0.3s ease, border-color 0.3s ease,
        color 0.3s ease;
      cursor: pointer;
      display: flex;
      flex-direction: column;
      align-items: center;
      justify-content: center;
      padding: 3rem 1.5rem;
      text-align: center;
      color: #6d4c41;
      user-select: none;
      outline-offset: 4px;
    }

    .upload-area:hover,
    .upload-area.dragover {
      background-color: #ffebe6;
      border-color: #6d4c41;
      color: #4e342e;
    }

    body.dark .upload-area {
      background-color: #4e342e;
      color: #ffccbc;
      border-color: #a1887f;
+
}
|
| 216 |
+
|
| 217 |
+
body.dark .upload-area:hover,
|
| 218 |
+
body.dark .upload-area.dragover {
|
| 219 |
+
background-color: #3e2723;
|
| 220 |
+
border-color: #ffab91;
|
| 221 |
+
color: #ffccbc;
|
| 222 |
+
}
|
| 223 |
+
|
| 224 |
+
input[type="file"] {
|
| 225 |
+
display: none;
|
| 226 |
+
}
|
| 227 |
+
|
| 228 |
+
.upload-label {
|
| 229 |
+
font-weight: 600;
|
| 230 |
+
font-size: 1.125rem;
|
| 231 |
+
margin-top: 1rem;
|
| 232 |
+
color: inherit;
|
| 233 |
+
user-select: none;
|
| 234 |
+
}
|
| 235 |
+
|
| 236 |
+
.before-after-container {
|
| 237 |
+
position: relative;
|
| 238 |
+
overflow: hidden;
|
| 239 |
+
border-radius: 1rem;
|
| 240 |
+
box-shadow: 0 8px 24px rgb(0 0 0 / 0.1);
|
| 241 |
+
background-color: #fff7f3;
|
| 242 |
+
max-width: 100%;
|
| 243 |
+
user-select: none;
|
| 244 |
+
display: flex;
|
| 245 |
+
gap: 1rem;
|
| 246 |
+
justify-content: center;
|
| 247 |
+
align-items: center;
|
| 248 |
+
flex-wrap: wrap;
|
| 249 |
+
}
|
| 250 |
+
|
| 251 |
+
.before-image,
|
| 252 |
+
.after-image {
|
| 253 |
+
display: block;
|
| 254 |
+
width: 48%;
|
| 255 |
+
height: auto;
|
| 256 |
+
pointer-events: none;
|
| 257 |
+
user-select: none;
|
| 258 |
+
border-radius: 1rem;
|
| 259 |
+
box-shadow: 0 4px 12px rgb(0 0 0 / 0.1);
|
| 260 |
+
transition: opacity 1s ease;
|
| 261 |
+
}
|
| 262 |
+
|
| 263 |
+
@media (max-width: 768px) {
|
| 264 |
+
.before-after-container {
|
| 265 |
+
flex-direction: column;
|
| 266 |
+
}
|
| 267 |
+
|
| 268 |
+
.before-image,
|
| 269 |
+
.after-image {
|
| 270 |
+
width: 100%;
|
| 271 |
+
}
|
| 272 |
+
|
| 273 |
+
.upload-area {
|
| 274 |
+
padding: 2rem 1rem;
|
| 275 |
+
}
|
| 276 |
+
}
|
| 277 |
+
|
| 278 |
+
@media (max-width: 480px) {
|
| 279 |
+
.before-after-container {
|
| 280 |
+
max-height: none;
|
| 281 |
+
}
|
| 282 |
+
|
| 283 |
+
.upload-area {
|
| 284 |
+
padding: 1.5rem 1rem;
|
| 285 |
+
}
|
| 286 |
+
}
|
| 287 |
+
|
| 288 |
+
.features-grid,
|
| 289 |
+
.steps-grid {
|
| 290 |
+
display: grid;
|
| 291 |
+
grid-template-columns: repeat(auto-fit, minmax(220px, 1fr));
|
| 292 |
+
gap: 1.5rem;
|
| 293 |
+
}
|
| 294 |
+
|
| 295 |
+
#learnMoreContent {
|
| 296 |
+
max-height: 0;
|
| 297 |
+
overflow: hidden;
|
| 298 |
+
transition: max-height 0.5s ease;
|
| 299 |
+
color: #6d4c41;
|
| 300 |
+
}
|
| 301 |
+
|
| 302 |
+
#learnMoreContent.open {
|
| 303 |
+
max-height: 500px;
|
| 304 |
+
}
|
| 305 |
+
|
| 306 |
+
.learn-more-btn {
|
| 307 |
+
color: #6d4c41;
|
| 308 |
+
font-weight: 600;
|
| 309 |
+
cursor: pointer;
|
| 310 |
+
user-select: none;
|
| 311 |
+
transition: color 0.3s ease;
|
| 312 |
+
}
|
| 313 |
+
|
| 314 |
+
.learn-more-btn:hover,
|
| 315 |
+
.learn-more-btn:focus {
|
| 316 |
+
color: #4e342e;
|
| 317 |
+
outline: none;
|
| 318 |
+
}
|
| 319 |
+
|
| 320 |
+
ul li a {
|
| 321 |
+
transition: color 0.3s ease;
|
| 322 |
+
}
|
| 323 |
+
|
| 324 |
+
ul li a:hover,
|
| 325 |
+
ul li a:focus {
|
| 326 |
+
color: #6d4c41;
|
| 327 |
+
outline: none;
|
| 328 |
+
}
|
| 329 |
+
|
| 330 |
+
footer {
|
| 331 |
+
color: #6d4c41;
|
| 332 |
+
}
|
| 333 |
+
|
| 334 |
+
footer a {
|
| 335 |
+
color: #6d4c41;
|
| 336 |
+
transition: color 0.3s ease;
|
| 337 |
+
}
|
| 338 |
+
|
| 339 |
+
footer a:hover,
|
| 340 |
+
footer a:focus {
|
| 341 |
+
color: #4e342e;
|
| 342 |
+
outline: none;
|
| 343 |
+
}
|
| 344 |
+
|
| 345 |
+
form#uploadForm {
|
| 346 |
+
margin-top: 1.5rem;
|
| 347 |
+
display: flex;
|
| 348 |
+
flex-direction: column;
|
| 349 |
+
align-items: center;
|
| 350 |
+
gap: 1rem;
|
| 351 |
+
max-width: 320px;
|
| 352 |
+
margin-left: auto;
|
| 353 |
+
margin-right: auto;
|
| 354 |
+
}
|
| 355 |
+
|
| 356 |
+
form#uploadForm select,
|
| 357 |
+
form#uploadForm button {
|
| 358 |
+
width: 100%;
|
| 359 |
+
border-radius: 0.5rem;
|
| 360 |
+
border: 1.5px solid #6d4c41;
|
| 361 |
+
padding: 0.5rem 1rem;
|
| 362 |
+
font-size: 1rem;
|
| 363 |
+
font-weight: 600;
|
| 364 |
+
color: #3e2723;
|
| 365 |
+
background-color: #fff7f3;
|
| 366 |
+
transition: background-color 0.3s ease, border-color 0.3s ease;
|
| 367 |
+
cursor: pointer;
|
| 368 |
+
}
|
| 369 |
+
|
| 370 |
+
form#uploadForm select:focus,
|
| 371 |
+
form#uploadForm button:focus {
|
| 372 |
+
outline: none;
|
| 373 |
+
border-color: #4e342e;
|
| 374 |
+
background-color: #ffebe6;
|
| 375 |
+
}
|
| 376 |
+
|
| 377 |
+
form#uploadForm button {
|
| 378 |
+
background-color: #6d4c41;
|
| 379 |
+
color: #faf9f6;
|
| 380 |
+
box-shadow: 0 4px 8px rgb(109 76 65 / 0.5);
|
| 381 |
+
transition: background-color 0.3s ease, box-shadow 0.3s ease,
|
| 382 |
+
transform 0.15s ease, color 0.3s ease;
|
| 383 |
+
}
|
| 384 |
+
|
| 385 |
+
form#uploadForm button:hover,
|
| 386 |
+
form#uploadForm button:focus {
|
| 387 |
+
background-color: #4e342e;
|
| 388 |
+
box-shadow: 0 6px 12px rgb(78 52 46 / 0.7);
|
| 389 |
+
color: #ffccbc;
|
| 390 |
+
transform: scale(1.05);
|
| 391 |
+
outline: none;
|
| 392 |
+
}
|
| 393 |
+
|
| 394 |
+
form#uploadForm input[type="file"] {
|
| 395 |
+
cursor: pointer;
|
| 396 |
+
border-radius: 0.5rem;
|
| 397 |
+
border: 1.5px solid #6d4c41;
|
| 398 |
+
padding: 0.5rem 1rem;
|
| 399 |
+
background-color: #fff7f3;
|
| 400 |
+
color: #3e2723;
|
| 401 |
+
font-weight: 600;
|
| 402 |
+
transition: background-color 0.3s ease, border-color 0.3s ease;
|
| 403 |
+
width: 100%;
|
| 404 |
+
display: block;
|
| 405 |
+
}
|
| 406 |
+
|
| 407 |
+
form#uploadForm input[type="file"]:focus {
|
| 408 |
+
outline: none;
|
| 409 |
+
border-color: #4e342e;
|
| 410 |
+
background-color: #ffebe6;
|
| 411 |
+
}
|
| 412 |
+
|
| 413 |
+
#result {
|
| 414 |
+
margin-top: 2rem;
|
| 415 |
+
text-align: center;
|
| 416 |
+
color: #3e2723;
|
| 417 |
+
}
|
| 418 |
+
|
| 419 |
+
#result h2 {
|
| 420 |
+
font-weight: 700;
|
| 421 |
+
font-size: 1.5rem;
|
| 422 |
+
margin-bottom: 1rem;
|
| 423 |
+
}
|
| 424 |
+
|
| 425 |
+
#outputImage {
|
| 426 |
+
max-width: 100%;
|
| 427 |
+
border-radius: 1rem;
|
| 428 |
+
box-shadow: 0 8px 20px rgb(0 0 0 / 0.1);
|
| 429 |
+
display: none;
|
| 430 |
+
margin: 0 auto;
|
| 431 |
+
}
|
| 432 |
+
|
| 433 |
+
body.dark form#uploadForm select,
|
| 434 |
+
body.dark form#uploadForm button,
|
| 435 |
+
body.dark form#uploadForm input[type="file"] {
|
| 436 |
+
background-color: #4e342e;
|
| 437 |
+
color: #ffccbc;
|
| 438 |
+
border-color: #a1887f;
|
| 439 |
+
}
|
| 440 |
+
|
| 441 |
+
body.dark form#uploadForm select:focus,
|
| 442 |
+
body.dark form#uploadForm button:focus,
|
| 443 |
+
body.dark form#uploadForm input[type="file"]:focus {
|
| 444 |
+
background-color: #3e2723;
|
| 445 |
+
border-color: #ffab91;
|
| 446 |
+
color: #ffccbc;
|
| 447 |
+
}
|
| 448 |
+
|
| 449 |
+
body.dark #result {
|
| 450 |
+
color: #d7ccc8;
|
| 451 |
+
}
|
| 452 |
+
</style>
|
| 453 |
+
</head>

<body class="relative flex flex-col min-h-screen transition-colors duration-500">
    <!-- HEADER -->
    <header class="fixed top-0 left-0 right-0 z-50 border-b border-transparent backdrop-blur-sm" id="header"
        role="banner" style="background-color: rgba(250, 249, 246, 0.85)">
        <nav aria-label="Primary Navigation"
            class="max-w-7xl mx-auto flex items-center justify-between px-6 sm:px-8 lg:px-12 h-16">
            <a aria-label="FaceAging AI Home" class="flex items-center space-x-3 font-semibold text-xl select-none"
                href="#" style="color: #3e2723">
                <img alt="FaceAging AI logo, stylized FA letters in circle" aria-hidden="true"
                    class="w-10 h-10 rounded-full" draggable="false" height="40"
                    src="https://storage.googleapis.com/a1aa/image/02d7a33c-afc0-4d71-885c-1ebc483270ab.jpg"
                    width="40" />
                <span class="font-inter font-bold tracking-wide">FaceAging AI</span>
            </a>
            <ul class="hidden md:flex space-x-10 font-medium text-brown" role="menubar" style="color: #3e2723">
                <li role="none">
                    <a class="hover:text-[#6D4C41] focus:outline-none focus:text-[#6D4C41]" href="#about"
                        role="menuitem" tabindex="0">About</a>
                </li>
                <li role="none">
                    <a class="hover:text-[#6D4C41] focus:outline-none focus:text-[#6D4C41]" href="#features"
                        role="menuitem" tabindex="0">Features</a>
                </li>
                <li role="none">
                    <a class="hover:text-[#6D4C41] focus:outline-none focus:text-[#6D4C41]" href="#tryItOut"
                        role="menuitem" tabindex="0">Try It Out</a>
                </li>
                <li role="none">
                    <a class="hover:text-[#6D4C41] focus:outline-none focus:text-[#6D4C41]" href="#contact"
                        role="menuitem" tabindex="0">Contact</a>
                </li>
            </ul>
            <div class="flex items-center space-x-4">
                <button aria-label="Toggle dark mode"
                    class="focus:outline-none focus:ring-2 focus:ring-[#FF7043] rounded p-1" id="darkToggle"
                    title="Toggle dark mode" type="button" style="color: #3e2723; visibility: hidden;">
                    <i class="fas fa-moon fa-lg"></i>
                </button>
                <!-- Mobile menu button -->
                <button aria-controls="mobileMenu" aria-expanded="false" aria-label="Toggle menu"
                    class="md:hidden focus:outline-none focus:ring-2 focus:ring-[#FF7043] rounded p-1"
                    id="mobileMenuButton" type="button" style="color: #3e2723">
                    <i class="fas fa-bars fa-lg"></i>
                </button>
            </div>
        </nav>
        <!-- Mobile menu -->
        <div aria-label="Mobile Navigation" class="hidden md:hidden border-t border-transparent" id="mobileMenu"
            role="menu" style="background-color: rgba(250, 249, 246, 0.95)">
            <ul class="flex flex-col space-y-2 p-4 font-medium" style="color: #3e2723">
                <li role="none">
                    <a class="block px-3 py-2 rounded hover:bg-[#FFEBE6] hover:text-[#6D4C41] focus:outline-none focus:bg-[#FFEBE6] focus:text-[#6D4C41] transition"
                        href="#about" role="menuitem" tabindex="0">About</a>
                </li>
                <li role="none">
                    <a class="block px-3 py-2 rounded hover:bg-[#FFEBE6] hover:text-[#6D4C41] focus:outline-none focus:bg-[#FFEBE6] focus:text-[#6D4C41] transition"
                        href="#features" role="menuitem" tabindex="0">Features</a>
                </li>
                <li role="none">
                    <a class="block px-3 py-2 rounded hover:bg-[#FFEBE6] hover:text-[#6D4C41] focus:outline-none focus:bg-[#FFEBE6] focus:text-[#6D4C41] transition"
                        href="#tryItOut" role="menuitem" tabindex="0">Try It Out</a>
                </li>
                <li role="none">
                    <a class="block px-3 py-2 rounded hover:bg-[#FFEBE6] hover:text-[#6D4C41] focus:outline-none focus:bg-[#FFEBE6] focus:text-[#6D4C41] transition"
                        href="#contact" role="menuitem" tabindex="0">Contact</a>
                </li>
            </ul>
        </div>
    </header>
    <!-- MAIN CONTENT -->
    <main class="flex-grow pt-20">
        <!-- HERO -->
        <section class="max-w-5xl mx-auto px-6 sm:px-8 lg:px-12 text-center py-20 fade-in" id="hero" tabindex="-1">
            <h1 class="text-4xl sm:text-5xl font-extrabold leading-tight max-w-3xl mx-auto" style="color: #3e2723">
                FaceAging AI
            </h1>
            <p class="mt-4 text-lg sm:text-xl max-w-2xl mx-auto" style="color: #6d4c41">
                Realistic AI-powered face age transformation — look older or younger instantly.
            </p>
            <button aria-label="Upload Image"
                class="primary-btn mt-10 px-8 py-4 text-lg rounded-lg shadow-lg focus:outline-none focus:ring-4 focus:ring-[#FF7043]"
                id="uploadBtnHero" type="button">
                <i class="fas fa-upload mr-3"></i> Upload Image
            </button>
        </section>

        <!-- TRY IT OUT -->
        <section aria-label="Try FaceAging AI"
            class="max-w-full mx-auto px-12 sm:px-20 lg:px-28 py-20 scroll-animate"
            id="tryItOut" tabindex="-1">

            <form class="space-y-10 max-w-5xl mx-auto" id="uploadForm" enctype="multipart/form-data" novalidate>

                <!-- Upload Box -->
                <div class="border-2 border-dashed border-[#A1887F] rounded-3xl p-14 text-center bg-[#EFEBE9] shadow-md">
                    <label for="fileInput" class="block text-3xl font-semibold text-[#3E2723] mb-6">
                        Upload Your Face Image
                    </label>
                    <input accept="image/*" class="block mx-auto text-lg mb-5 w-full max-w-4xl" id="fileInput" name="file" required
                        type="file" aria-label="Upload face image" />
                    <p class="text-base text-[#5D4037]">Supported formats: JPG, PNG • Max size: 5MB</p>
                </div>

                <!-- Controls Section -->
                <div class="flex flex-col sm:flex-row items-center justify-center gap-10 max-w-5xl mx-auto">

                    <!-- Dropdown -->
                    <div class="w-full sm:w-1/2">
                        <label for="conversion" class="block text-lg font-medium text-[#3E2723] mb-3">
                            Choose Transformation
                        </label>
                        <select id="conversion" name="conversion" required
                            class="w-full px-6 py-4 rounded-3xl border border-[#A1887F] bg-white text-[#3E2723] text-lg shadow-sm focus:outline-none focus:ring-4 focus:ring-[#6D4C41] transition">
                            <option value="young_to_old">Young to Old</option>
                            <option value="old_to_young">Old to Young</option>
                        </select>
                    </div>

                    <!-- Generate Button -->
                    <div class="w-full sm:w-1/3 mt-6 sm:mt-0">
                        <button id="generateBtn" type="submit"
                            class="primary-btn w-full sm:w-auto px-8 py-4 bg-[#6D4C41] text-white font-semibold text-xl rounded-3xl shadow-md hover:bg-[#5D4037] transition focus:outline-none focus:ring-4 focus:ring-[#6D4C41]"
                            aria-label="Generate aged or de-aged face image">
                            Generate
                        </button>
                    </div>

                </div>
            </form>

            <!-- Result -->
            <div id="result" class="mt-20 text-center max-w-6xl mx-auto">
                <h2 class="text-3xl font-bold mb-8 text-[#3e2723]">Result:</h2>
                <img alt="Result image showing the face after the aging or de-aging transformation"
                    class="mx-auto rounded-3xl shadow-lg max-w-full" id="outputImage" src="#" style="display: none" loading="lazy" />
            </div>
        </section>

        <!-- AI INSIGHT PANEL -->
        <section aria-label="AI Insight Panel"
            class="max-w-5xl mx-auto px-6 sm:px-8 lg:px-12 py-16 scroll-animate bg-[#FFCCBC] rounded-3xl shadow-lg mt-20"
            id="aiInsight" tabindex="-1" style="color: #4e342e">
            <h2 class="text-3xl font-extrabold mb-6 select-none" style="color: #6d4c41">
                AI Insight
            </h2>
            <p class="max-w-3xl leading-relaxed">
                FaceAging AI leverages state-of-the-art deep learning models trained on
                diverse datasets to realistically simulate aging and de-aging effects on
                facial images. Our algorithms deliver natural, high-fidelity
                transformations while preserving your unique features.
            </p>
            <button aria-controls="learnMoreContent" aria-expanded="false"
                class="mt-4 learn-more-btn font-semibold rounded focus:ring-2 focus:ring-[#6D4C41]" id="learnMoreBtn"
                type="button">
                Learn More
            </button>
            <div class="mt-4 max-w-3xl overflow-hidden" hidden id="learnMoreContent" style="color: #6d4c41">
                <p class="mb-3">
                    Our AI pipeline includes advanced face detection, landmark alignment,
                    and generative adversarial networks (GANs) fine-tuned for age
                    progression and regression. The model adapts to varied lighting,
                    angles, and ethnicities to provide consistent results.
                </p>
                <p>
                    We prioritize user privacy by processing images securely and never
                    storing personal data. Transformations happen in real time,
                    offering a seamless and engaging experience.
                </p>
            </div>
        </section>

        <!-- ABOUT -->
        <section aria-label="About FaceAging AI" class="max-w-5xl mx-auto px-6 sm:px-8 lg:px-12 py-16 scroll-animate"
            id="about" tabindex="-1" style="color: #3e2723">
            <h2 class="text-3xl font-extrabold mb-6 text-center select-none" style="color: #6d4c41">
                About FaceAging AI
            </h2>
            <p class="max-w-3xl mx-auto leading-relaxed text-center">
                <strong>FaceAging AI</strong> is a modern, AI-powered web application designed to transform facial
                images by simulating realistic age progression and regression. Powered by deep learning
                models and computer vision techniques, FaceAging AI lets users visualize themselves at different
                ages with high-quality, photorealistic results, directly in their browser.
            </p>
            <br />
            <p class="max-w-3xl mx-auto leading-relaxed text-center">
                Whether you're curious to see your older self or wish to recreate a youthful look, FaceAging AI provides
                a secure, seamless, and intuitive experience. All uploaded images are handled with the utmost care and
                privacy. The platform is designed to be fast, accessible, and user-friendly, requiring no advanced
                technical skills to use.
            </p>
            <br />
            <p class="max-w-3xl mx-auto leading-relaxed text-center">
                This application was developed by <strong>Saksham Pathak</strong>, an AI
                researcher and developer currently pursuing a Master's degree in <strong>Artificial Intelligence and
                Machine Learning</strong> at <strong>IIIT Lucknow</strong>. With a strong foundation in AI
                technologies and a commitment to building ethical and impactful digital solutions, Saksham created
                FaceAging AI to demonstrate how artificial intelligence can enhance everyday digital
                experiences in a safe and meaningful way.
            </p>
            <br />
            <p class="max-w-3xl mx-auto leading-relaxed text-center">
                Thank you for using FaceAging AI. We hope you enjoy exploring the possibilities of AI-powered facial
                transformation. Your feedback is always welcome and appreciated.
            </p>
        </section>

        <!-- FEATURES -->
        <section aria-label="Features of FaceAging AI"
            class="max-w-6xl mx-auto px-6 sm:px-8 lg:px-12 py-16 scroll-animate" id="features" tabindex="-1">
            <h2 class="text-3xl font-extrabold mb-10 text-center select-none" style="color: #3e2723">
                Features
            </h2>
            <div class="features-grid">
                <article aria-label="High accuracy feature" class="feature-card p-6 text-center focus:outline-none"
                    tabindex="0">
                    <i aria-hidden="true" class="fas fa-bullseye text-[#6D4C41] text-5xl mb-4"></i>
                    <h3 class="text-xl font-semibold mb-2" style="color: #4e342e">
                        High Accuracy
                    </h3>
                    <p>
                        Our AI models deliver precise and realistic age transformations,
                        preserving your unique facial features.
                    </p>
                </article>
                <article aria-label="Fast processing feature" class="feature-card p-6 text-center focus:outline-none"
                    tabindex="0">
                    <i aria-hidden="true" class="fas fa-bolt text-[#6D4C41] text-5xl mb-4"></i>
                    <h3 class="text-xl font-semibold mb-2" style="color: #4e342e">
                        Fast Processing
                    </h3>
                    <p>
                        Experience near-instant results with optimized AI pipelines and
                        efficient cloud processing.
                    </p>
                </article>
                <article aria-label="Seamless integration feature"
                    class="feature-card p-6 text-center focus:outline-none" tabindex="0">
                    <i aria-hidden="true" class="fas fa-plug text-[#6D4C41] text-5xl mb-4"></i>
                    <h3 class="text-xl font-semibold mb-2" style="color: #4e342e">
                        Seamless Integration
                    </h3>
                    <p>
                        Easily embed FaceAging AI into your apps or websites with our
                        flexible APIs and SDKs.
                    </p>
                </article>
                <article aria-label="Privacy focused feature" class="feature-card p-6 text-center focus:outline-none"
                    tabindex="0">
                    <i aria-hidden="true" class="fas fa-user-shield text-[#6D4C41] text-5xl mb-4"></i>
                    <h3 class="text-xl font-semibold mb-2" style="color: #4e342e">
                        Privacy Focused
                    </h3>
                    <p>
                        Your images are processed securely and never stored, ensuring your
                        privacy and data protection.
                    </p>
                </article>
            </div>
        </section>
        <!-- CONTACT -->
        <section aria-label="Contact information" class="max-w-5xl mx-auto px-6 sm:px-8 lg:px-12 py-16 scroll-animate"
            id="contact" tabindex="-1" style="color: #3e2723">
            <h2 class="text-3xl font-extrabold mb-6 text-center select-none" style="color: #6d4c41">
                Contact
            </h2>
            <p class="text-center max-w-3xl mx-auto mb-6" style="color: #4e342e">
                Have questions or want to collaborate? Reach out via email or social media:
            </p>
            <ul class="flex flex-col sm:flex-row justify-center gap-8 text-lg max-w-3xl mx-auto" style="color: #3e2723">
                <li class="flex items-center space-x-3">
                    <i class="fas fa-envelope text-[#6D4C41] text-xl"></i>
                    <a class="hover:text-[#4E342E] focus:outline-none focus:text-[#4E342E] transition"
                        href="mailto:pathaksaksham430@gmail.com" tabindex="0">pathaksaksham430@gmail.com</a>
                </li>
                <li class="flex items-center space-x-3">
                    <i class="fab fa-linkedin-in text-[#6D4C41] text-xl"></i>
                    <a class="hover:text-[#4E342E] focus:outline-none focus:text-[#4E342E] transition"
                        href="https://linkedin.com/in/sakshampathak" rel="noopener noreferrer" target="_blank"
                        tabindex="0">linkedin.com/in/sakshampathak</a>
                </li>
                <li class="flex items-center space-x-3">
                    <i class="fab fa-github text-[#6D4C41] text-xl"></i>
                    <a class="hover:text-[#4E342E] focus:outline-none focus:text-[#4E342E] transition"
                        href="https://github.com/parthmax2" rel="noopener noreferrer" target="_blank"
                        tabindex="0">github.com/parthmax2</a>
                </li>
            </ul>
        </section>
    </main>
    <!-- FOOTER -->
    <footer class="border-t border-[#D7CCC8] py-8 px-6 sm:px-12 mt-20" role="contentinfo" style="color: #3e2723">
        <div class="max-w-7xl mx-auto flex flex-col sm:flex-row items-center justify-between space-y-6 sm:space-y-0">
            <div class="flex space-x-6 text-2xl" style="color: #3e2723">
                <a aria-label="Instagram"
                    class="hover:text-[#6D4C41] focus:outline-none focus:text-[#6D4C41] transition"
                    href="https://instagram.com/parthmax_" rel="noopener noreferrer" target="_blank" tabindex="0"><i
                        class="fab fa-instagram"></i></a>
                <a aria-label="LinkedIn" class="hover:text-[#6D4C41] focus:outline-none focus:text-[#6D4C41] transition"
                    href="https://linkedin.com/in/sakshampathak" rel="noopener noreferrer" target="_blank" tabindex="0"><i
                        class="fab fa-linkedin-in"></i></a>
                <a aria-label="GitHub" class="hover:text-[#6D4C41] focus:outline-none focus:text-[#6D4C41] transition"
                    href="https://github.com/parthmax2" rel="noopener noreferrer" target="_blank" tabindex="0"><i
                        class="fab fa-github"></i></a>
            </div>
            <p class="text-center sm:text-left text-sm max-w-xl select-none" style="color: #3e2723">
                © FaceAging AI. Developed by <a aria-label="Instagram profile of Parthmax"
                    class="hover:text-[#6D4C41] focus:outline-none focus:text-[#6D4C41] transition"
                    href="https://instagram.com/parthmax_" rel="noopener noreferrer" target="_blank" tabindex="0"><strong>Parthmax</strong></a>
            </p>
        </div>
    </footer>
    <script src="/static/script.js"></script>
</body>

</html>