dschechter27 committed
Commit 738f7d4 · verified · 1 Parent(s): a89c082

Update README.md

Files changed (1):
  1. README.md +11 -7

README.md CHANGED
@@ -1,19 +1,23 @@
 ---
 title: Vision Model Interpretability
-emoji: 🌍
+emoji: 🔍
 colorFrom: red
 colorTo: yellow
 sdk: gradio
-sdk_version: 6.9.0
+sdk_version: 5.49.1
 app_file: app.py
 pinned: false
 license: mit
-short_description: Interactive Grad-CAM visualization for CNN image classificat
+short_description: Interactive Grad-CAM visualization
 ---
 
-# Vision Model Interpretability with Grad-CAM - David Schecheter
+# Vision Model Interpretability with Grad-CAM
+**David Schechter**
 
 Upload an image to:
-- classify it with ResNet-18
-- view the top-3 predictions
-- visualize a Grad-CAM heatmap showing which regions influenced the prediction
+
+- classify it using **ResNet-18**
+- view the **top-3 predictions**
+- visualize a **Grad-CAM heatmap** highlighting the image regions that influenced the model’s decision
+
+This demo explores **model interpretability in computer vision** by showing how gradients from a convolutional neural network can be used to explain predictions.