---
title: Adversarial Attack Demo
emoji: 🛡️
colorFrom: red
colorTo: yellow
sdk: gradio
sdk_version: "5.29.0"
app_file: app.py
pinned: false
license: mit
---
# Adversarial Attack Demo | FGSM & PGD
Upload an image and watch how small, imperceptible perturbations can fool a neural network classifier.
**Course**: 215 AI Safety, ch. 1–2
## Features
- FGSM (Fast Gradient Sign Method) attack
- PGD (Projected Gradient Descent) iterative attack
- Side-by-side comparison: original vs perturbation vs adversarial
- Adjustable epsilon, step size, and iteration count
- L-inf / L2 / SSIM metrics
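The core of both attacks fits in a few lines. Below is a minimal NumPy sketch on a hypothetical logistic-regression "classifier" (the actual `app.py` attacks a real neural network; the weight vector `w`, inputs, and function names here are illustrative assumptions, not the app's code). FGSM takes one step of size epsilon in the direction of the sign of the input gradient; PGD takes many small steps and projects back into the L-inf ball of radius epsilon around the original image.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def input_grad(w, x, y):
    # Gradient of binary cross-entropy w.r.t. the INPUT x
    # for a logistic model p = sigmoid(w . x): dL/dx = (p - y) * w
    return (sigmoid(w @ x) - y) * w

def fgsm(w, x, y, eps):
    # Single-step attack: shift every component by eps in the
    # direction of the gradient sign, then clip to valid pixel range.
    return np.clip(x + eps * np.sign(input_grad(w, x, y)), 0.0, 1.0)

def pgd(w, x, y, eps, alpha, steps):
    # Iterative attack: small steps of size alpha, projected back
    # into the L-inf ball of radius eps around the clean input x.
    x_adv = x.copy()
    for _ in range(steps):
        x_adv = x_adv + alpha * np.sign(input_grad(w, x_adv, y))
        x_adv = np.clip(x_adv, x - eps, x + eps)   # L-inf projection
        x_adv = np.clip(x_adv, 0.0, 1.0)           # stay in pixel range
    return x_adv
```

At the same epsilon, PGD usually finds stronger adversarial examples than single-step FGSM, which is why the demo exposes step size and iteration count as separate sliders.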