---
license: mit
language:
- en
tags:
- neuralnetworks
- pytorch
- normaldistribution
- math
- noisydata
---
# Noisy Gaussian NN – Robustness to Label Noise

## Overview
This project explores how a simple 1-hidden-layer neural network handles increasing label noise when fitting a Gaussian curve.  
We test three noise levels (σ = 0.05, 0.1, 0.2) to see when the network smooths out the noise effectively and when it starts to underfit the underlying curve.

## Dataset
- Synthetic dataset: Gaussian curve (`y = exp(-x^2)`)
- Noise added directly to labels using `torch.normal` (see the sketch after this list)
- 200 evenly spaced `x` points in [-2, 2]
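
A minimal sketch of this dataset generation (the helper name `make_noisy_gaussian` is an assumption for illustration; the notebook's exact code may differ):

```python
import torch

def make_noisy_gaussian(n_points=200, sigma=0.1):
    """Synthetic data: y = exp(-x^2) with Gaussian noise added directly to the labels."""
    # 200 evenly spaced x points in [-2, 2]
    x = torch.linspace(-2.0, 2.0, n_points).unsqueeze(1)
    y_clean = torch.exp(-x ** 2)
    # label noise drawn with torch.normal at the chosen sigma
    y_noisy = y_clean + torch.normal(mean=0.0, std=sigma, size=y_clean.shape)
    return x, y_clean, y_noisy
```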

## Model
- **Architecture:** 1 hidden layer, 50 neurons, `ReLU` activation
- **Loss:** MSELoss
- **Optimizer:** Adam (lr=0.01)
- **Training:** 2000 epochs (see the training sketch after this list)
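
A minimal training sketch under those settings, reusing the hypothetical `make_noisy_gaussian` helper from the dataset section (not necessarily the notebook's exact code):

```python
import torch
import torch.nn as nn

# 1 hidden layer, 50 neurons, ReLU activation
model = nn.Sequential(nn.Linear(1, 50), nn.ReLU(), nn.Linear(50, 1))

criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

x, _, y_noisy = make_noisy_gaussian(sigma=0.1)  # medium noise level

# full-batch training for 2000 epochs
for epoch in range(2000):
    optimizer.zero_grad()
    loss = criterion(model(x), y_noisy)
    loss.backward()
    optimizer.step()
```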

## Results
- Low noise: NN fits curve smoothly.
- Medium noise: Slight underfitting.
- High noise: Curve shape lost, noise dominates.

### Key Insight
> More noise ≠ better regularization.  
> Too much noise can destroy the signal beyond recovery.

## Files
- `GaussianApproximation.ipynb` – Full experiment, plots, and analysis
- `README.md` – This file

## License
MIT License – free to use, modify, and distribute with attribution.