---
license: mit
language:
- en
tags:
- neuralnetworks
- pytorch
- normaldistribution
- math
- noisydata
---

# Noisy Gaussian NN – Robustness to Label Noise

## Overview

This project explores how a simple 1-hidden-layer neural network handles increasing label noise when fitting a Gaussian curve.
We test three noise levels (σ = 0.05, 0.1, 0.2) to see when the network smooths the noise effectively and when it starts to underfit.

## Dataset

- Synthetic dataset: Gaussian curve (`y = exp(-x^2)`)
- Noise added directly to labels using `torch.normal`
- 200 evenly spaced `x` points in [-2, 2]

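As a sketch, the dataset above can be generated like this (variable names are illustrative; `sigma` is one of the three noise levels tested):

```python
import torch

# 200 evenly spaced x points in [-2, 2], shaped (200, 1) for a Linear(1, ...) input
x = torch.linspace(-2, 2, 200).unsqueeze(1)

# Clean Gaussian curve y = exp(-x^2)
y_clean = torch.exp(-x ** 2)

# Add noise directly to the labels with torch.normal
# (sigma would be 0.05, 0.1, or 0.2 in the experiments)
sigma = 0.1
y_noisy = y_clean + torch.normal(torch.zeros_like(y_clean), sigma)
```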
## Model

- **Architecture:** 1 hidden layer, 50 neurons, `ReLU` activation
- **Loss:** `MSELoss`
- **Optimizer:** Adam (lr = 0.01)
- **Training:** 2000 epochs

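A minimal sketch of this setup as a full-batch training loop — an illustrative reimplementation under the assumptions above, not the notebook's exact code:

```python
import torch
import torch.nn as nn

# Toy data as described in the Dataset section (sigma = 0.1 here)
torch.manual_seed(0)
x = torch.linspace(-2, 2, 200).unsqueeze(1)
y_noisy = torch.exp(-x ** 2) + torch.normal(torch.zeros_like(x), 0.1)

# 1 hidden layer, 50 neurons, ReLU activation
model = nn.Sequential(nn.Linear(1, 50), nn.ReLU(), nn.Linear(50, 1))

loss_fn = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

# 2000 full-batch epochs
for epoch in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y_noisy)
    loss.backward()
    optimizer.step()
```

With σ = 0.1 the loss should settle near the irreducible noise variance (σ² = 0.01) rather than near zero, which is what makes the underfitting-vs-smoothing comparison visible.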
## Results

- Low noise (σ = 0.05): the network fits the curve smoothly.
- Medium noise (σ = 0.1): slight underfitting appears.
- High noise (σ = 0.2): the curve's shape is lost and the noise dominates the fit.

### Key Insight

> More noise ≠ better regularization.
> Too much noise can destroy the signal beyond recovery.

## Files

- `notebook.ipynb` – Full experiment, plots, and analysis
- `README.md` – This file

## License

MIT License – free to use, modify, and distribute with attribution.