{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# ML Practice Series: Module 16 - Neural Networks (Deep Learning Foundations)\n",
"\n",
"Welcome to Module 16! We are entering the world of **Deep Learning**. We'll start with the building blocks of all neural networks: the **Perceptron** and the **Multi-Layer Perceptron (MLP)**.\n",
"\n",
"### Resources:\n",
"Visit your hub's **[Mathematics for Data Science](https://aashishgarg13.github.io/DataScience/math-ds-complete/)** section to review Calculus (Backpropagation/Partial Derivatives) which is the engine of Deep Learning.\n",
"\n",
"### Objectives:\n",
"1. **Neural Network Architecture**: Inputs, Hidden Layers, and Outputs.\n",
"2. **Activation Functions**: Sigmoid, ReLU, and Softmax.\n",
"3. **Training Process**: Forward Propagation & Backpropagation.\n",
"4. **Optimization**: Stochastic Gradient Descent (SGD) and Adam.\n",
"\n",
"---"
]
},
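{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before building anything, here is a quick NumPy sketch of the three activation functions named in the objectives. It is illustrative only, not Scikit-Learn's internal implementation:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"def sigmoid(z):\n",
"    # Squashes any real number into (0, 1)\n",
"    return 1.0 / (1.0 + np.exp(-z))\n",
"\n",
"def relu(z):\n",
"    # Passes positives through unchanged, zeroes out negatives\n",
"    return np.maximum(0.0, z)\n",
"\n",
"def softmax(z):\n",
"    # Turns a vector of scores into probabilities that sum to 1\n",
"    e = np.exp(z - np.max(z))  # subtract max for numerical stability\n",
"    return e / e.sum()\n",
"\n",
"z = np.array([-2.0, 0.0, 3.0])\n",
"print(sigmoid(z))   # each value in (0, 1); sigmoid(0) = 0.5\n",
"print(relu(z))      # [0. 0. 3.]\n",
"print(softmax(z))   # probabilities summing to 1"
]
},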
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 1. Setup\n",
"We will use the **MNIST** dataset (handwritten digits), accessed through Scikit-Learn's easy-to-use `MLPClassifier` interface for this foundations module."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import matplotlib.pyplot as plt\n",
"from sklearn.datasets import fetch_openml\n",
"from sklearn.neural_network import MLPClassifier\n",
"from sklearn.model_selection import train_test_split\n",
"from sklearn.metrics import classification_report, confusion_matrix\n",
"\n",
"# Load digits (MNIST small version)\n",
"X, y = fetch_openml('mnist_784', version=1, return_X_y=True, as_frame=False, parser='auto')\n",
"\n",
"# Use a subset for speed in practice\n",
"X = X[:5000] / 255.0\n",
"y = y[:5000]\n",
"\n",
"X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)\n",
"print(\"Training Shape:\", X_train.shape)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 2. Multi-Layer Perceptron (MLP)\n",
"\n",
"### Task 1: Building the Network\n",
"Configure an `MLPClassifier` with:\n",
"1. Two hidden layers (size 50 each).\n",
"2. 'relu' activation function.\n",
"3. 'adam' solver.\n",
"4. Max 20 iterations to start."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# YOUR CODE HERE\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<details>\n",
"<summary>Click to see Solution</summary>\n",
"\n",
"```python\n",
"mlp = MLPClassifier(hidden_layer_sizes=(50, 50), activation='relu', solver='adam',\n",
"                    alpha=1e-4, max_iter=20, learning_rate_init=0.001,\n",
"                    verbose=10, random_state=1)\n",
"mlp.fit(X_train, y_train)\n",
"```\n",
"</details>"
]
},
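{
"cell_type": "markdown",
"metadata": {},
"source": [
"`MLPClassifier` records the training loss after each iteration in its `loss_curve_` attribute. Assuming you fitted `mlp` in Task 1, you can plot it to watch gradient descent at work:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Assumes `mlp` was fitted in Task 1\n",
"plt.plot(mlp.loss_curve_)\n",
"plt.xlabel('Iteration')\n",
"plt.ylabel('Training loss')\n",
"plt.title('MLP Training Loss Curve')\n",
"plt.show()"
]
},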
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 3. Detailed Evaluation\n",
"\n",
"### Task 2: Confusion Matrix\n",
"Neural networks can often confuse similar digits (like 4 and 9). Plot the confusion matrix to see where your model is struggling."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import seaborn as sns\n",
"\n",
"# YOUR CODE HERE\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<details>\n",
"<summary>Click to see Solution</summary>\n",
"\n",
"```python\n",
"y_pred = mlp.predict(X_test)\n",
"cm = confusion_matrix(y_test, y_pred)\n",
"plt.figure(figsize=(10, 7))\n",
"sns.heatmap(cm, annot=True, fmt='d', cmap='Oranges')\n",
"plt.xlabel('Predicted')\n",
"plt.ylabel('Actual')\n",
"plt.show()\n",
"```\n",
"</details>"
]
},
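{
"cell_type": "markdown",
"metadata": {},
"source": [
"The heatmap shows *where* the model errs; `classification_report` (already imported in Setup) summarises per-digit precision, recall, and F1:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Assumes `mlp` was fitted in Task 1\n",
"y_pred = mlp.predict(X_test)\n",
"print(classification_report(y_test, y_pred))"
]
},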
{
"cell_type": "markdown",
"metadata": {},
"source": [
"--- \n",
"### Congratulations! \n",
"You've trained your first Neural Network. This is the foundation for Computer Vision and NLP.\n",
"Next: **Reinforcement Learning**."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.7"
}
},
"nbformat": 4,
"nbformat_minor": 4
}