Update README.md

README.md CHANGED
@@ -1,19 +1,23 @@
 ---
 title: Vision Model Interpretability
-emoji:
+emoji: 🔍
 colorFrom: red
 colorTo: yellow
 sdk: gradio
-sdk_version:
+sdk_version: 5.49.1
 app_file: app.py
 pinned: false
 license: mit
-short_description: Interactive Grad-CAM visualization
+short_description: Interactive Grad-CAM visualization
 ---
 
-# Vision Model Interpretability with Grad-CAM
+# Vision Model Interpretability with Grad-CAM
+**David Schechter**
 
 Upload an image to:
-
--
--
+
+- classify it using **ResNet-18**
+- view the **top-3 predictions**
+- visualize a **Grad-CAM heatmap** highlighting the image regions that influenced the model’s decision
+
+This demo explores **model interpretability in computer vision** by showing how gradients from a convolutional neural network can be used to explain predictions.