{ "cells": [ { "cell_type": "markdown", "id": "0d580912", "metadata": {}, "source": [ "# 🧠 Deep Learning Security Models\n", "\n", "## Advanced Neural Networks for Cybersecurity\n", "\n", "This notebook focuses on training **deep learning models** for security classification:\n", "\n", "- **Transformer-based Detection** - Attention mechanisms for sequence analysis\n", "- **Convolutional Networks** - Pattern detection in security data\n", "- **LSTM/GRU Networks** - Temporal pattern recognition\n", "- **AutoEncoders** - Anomaly detection via reconstruction error\n", "- **Multi-Task Learning** - Unified model for multiple security domains" ] }, { "cell_type": "code", "execution_count": 1, "id": "2a6ddc2d", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "🐍 Current Python: 3.15.0a3 (v3.15.0a3:f1eb0c0b0cd, Dec 16 2025, 08:05:19) [Clang 17.0.0 (clang-1700.6.3.2)]\n", "⚠️ Python 3.15 detected. TensorFlow requires Python 3.9-3.11\n", " Installing other packages without TensorFlow...\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ " \u001b[1;31merror\u001b[0m: \u001b[1msubprocess-exited-with-error\u001b[0m\n", " \n", " \u001b[31m×\u001b[0m \u001b[32minstalling build dependencies for scikit-learn\u001b[0m did not run successfully.\n", " \u001b[31m│\u001b[0m exit code: \u001b[1;36m1\u001b[0m\n", " \u001b[31m╰─>\u001b[0m \u001b[31m[81 lines of output]\u001b[0m\n", " \u001b[31m \u001b[0m Collecting meson-python<0.19.0,>=0.17.1\n", " \u001b[31m \u001b[0m Using cached meson_python-0.18.0-py3-none-any.whl.metadata (2.8 kB)\n", " \u001b[31m \u001b[0m Collecting cython<3.3.0,>=3.1.2\n", " \u001b[31m \u001b[0m Using cached cython-3.2.4-cp39-abi3-macosx_10_9_x86_64.whl.metadata (7.5 kB)\n", " \u001b[31m \u001b[0m Collecting numpy<2.4.0,>=2\n", " \u001b[31m \u001b[0m Using cached numpy-2.3.5.tar.gz (20.6 MB)\n", " \u001b[31m \u001b[0m Installing build dependencies: started\n", " \u001b[31m \u001b[0m Installing build dependencies: finished with status 'done'\n", " \u001b[31m \u001b[0m Getting requirements to build wheel: started\n", " \u001b[31m \u001b[0m Getting requirements to build wheel: finished with status 'done'\n", " \u001b[31m \u001b[0m Installing backend dependencies: started\n", " \u001b[31m \u001b[0m Installing backend dependencies: finished with status 'done'\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): started\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): still running...\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): still running...\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): still running...\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): still running...\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): still running...\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): still running...\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): still running...\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): still running...\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): still running...\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): still running...\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): still running...\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): still running...\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): still running...\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): still running...\n", " 
\u001b[31m \u001b[0m Preparing metadata (pyproject.toml): still running...\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): still running...\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): still running...\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): still running...\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): finished with status 'done'\n", " \u001b[31m \u001b[0m \u001b[33mWARNING: Retrying (Retry(total=4, connect=None, read=None, redirect=None, status=None)) after connection broken by 'ProtocolError('Connection aborted.', ConnectionResetError(54, 'Connection reset by peer'))': /simple/scipy/\u001b[0m\u001b[33m\n", " \u001b[31m \u001b[0m \u001b[0mCollecting scipy<1.17.0,>=1.10.0\n", " \u001b[31m \u001b[0m Using cached scipy-1.16.3.tar.gz (30.6 MB)\n", " \u001b[31m \u001b[0m Installing build dependencies: started\n", " \u001b[31m \u001b[0m Installing build dependencies: finished with status 'done'\n", " \u001b[31m \u001b[0m Getting requirements to build wheel: started\n", " \u001b[31m \u001b[0m Getting requirements to build wheel: finished with status 'done'\n", " \u001b[31m \u001b[0m Installing backend dependencies: started\n", " \u001b[31m \u001b[0m Installing backend dependencies: finished with status 'done'\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): started\n", " \u001b[31m \u001b[0m Preparing metadata (pyproject.toml): finished with status 'error'\n", " \u001b[31m \u001b[0m \u001b[1;31merror\u001b[0m: \u001b[1msubprocess-exited-with-error\u001b[0m\n", " \u001b[31m \u001b[0m \n", " \u001b[31m \u001b[0m \u001b[31m×\u001b[0m \u001b[32mPreparing metadata \u001b[0m\u001b[1;32m(\u001b[0m\u001b[32mpyproject.toml\u001b[0m\u001b[1;32m)\u001b[0m did not run successfully.\n", " \u001b[31m \u001b[0m \u001b[31m│\u001b[0m exit code: \u001b[1;36m1\u001b[0m\n", " \u001b[31m \u001b[0m \u001b[31m╰─>\u001b[0m \u001b[31m[23 lines of output]\u001b[0m\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m \u001b[36m\u001b[1m+ meson setup /private/var/folders/3f/7mz66tl156s4w_xt0pqq7bwc0000gn/T/pip-install-iutka178/scipy_bdc2fda37451456fa9ccb51189c51876 /private/var/folders/3f/7mz66tl156s4w_xt0pqq7bwc0000gn/T/pip-install-iutka178/scipy_bdc2fda37451456fa9ccb51189c51876/.mesonpy-3_laly6u -Dbuildtype=release -Db_ndebug=if-release -Db_vscrt=md --native-file=/private/var/folders/3f/7mz66tl156s4w_xt0pqq7bwc0000gn/T/pip-install-iutka178/scipy_bdc2fda37451456fa9ccb51189c51876/.mesonpy-3_laly6u/meson-python-native-file.ini\u001b[0m\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m The Meson build system\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m Version: 1.10.1\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m Source dir: /private/var/folders/3f/7mz66tl156s4w_xt0pqq7bwc0000gn/T/pip-install-iutka178/scipy_bdc2fda37451456fa9ccb51189c51876\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m Build dir: /private/var/folders/3f/7mz66tl156s4w_xt0pqq7bwc0000gn/T/pip-install-iutka178/scipy_bdc2fda37451456fa9ccb51189c51876/.mesonpy-3_laly6u\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m Build type: native build\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m Project name: scipy\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m Project version: 1.16.3\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m C compiler for the host machine: cc (clang 14.0.3 \"Apple clang version 14.0.3 (clang-1403.0.22.14.1)\")\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m C linker for the host machine: cc ld64 857.1\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m C++ 
compiler for the host machine: c++ (clang 14.0.3 \"Apple clang version 14.0.3 (clang-1403.0.22.14.1)\")\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m C++ linker for the host machine: c++ ld64 857.1\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m Cython compiler for the host machine: cython (cython 3.1.8)\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m Host machine cpu family: x86_64\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m Host machine cpu: x86_64\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m Program python found: YES (/Users/Dadaicon/Documents/GitHub/Real-Time-cyber-Forge-Agentic-AI/.venv/bin/python)\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m Found pkg-config: YES (/usr/local/bin/pkg-config) 2.5.1\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m Run-time dependency python found: YES 3.15\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m Program cython found: YES (/private/var/folders/3f/7mz66tl156s4w_xt0pqq7bwc0000gn/T/pip-build-env-dno50jhk/overlay/bin/cython)\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m ../meson.build:53:4: ERROR: Problem encountered: SciPy requires clang >= 15.0\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m A full log can be found at /private/var/folders/3f/7mz66tl156s4w_xt0pqq7bwc0000gn/T/pip-install-iutka178/scipy_bdc2fda37451456fa9ccb51189c51876/.mesonpy-3_laly6u/meson-logs/meson-log.txt\n", " \u001b[31m \u001b[0m \u001b[31m \u001b[0m \u001b[31m[end of output]\u001b[0m\n", " \u001b[31m \u001b[0m \n", " \u001b[31m \u001b[0m \u001b[1;35mnote\u001b[0m: This error originates from a subprocess, and is likely not a problem with pip.\n", " \u001b[31m \u001b[0m \u001b[1;31merror\u001b[0m: \u001b[1mmetadata-generation-failed\u001b[0m\n", " \u001b[31m \u001b[0m \n", " \u001b[31m \u001b[0m \u001b[31m×\u001b[0m Encountered error while generating package metadata.\n", " \u001b[31m \u001b[0m \u001b[31m╰─>\u001b[0m scipy\n", " \u001b[31m \u001b[0m \n", " \u001b[31m \u001b[0m \u001b[1;35mnote\u001b[0m: This is an issue with the package mentioned above, not pip.\n", " \u001b[31m \u001b[0m \u001b[1;36mhint\u001b[0m: See above for details.\n", " \u001b[31m \u001b[0m \u001b[31m[end of output]\u001b[0m\n", " \n", " \u001b[1;35mnote\u001b[0m: This error originates from a subprocess, and is likely not a problem with pip.\n", "\u001b[31mERROR: Failed to build 'scikit-learn' when installing build dependencies for scikit-learn\u001b[0m\u001b[31m\n", "\u001b[0mNote: you may need to restart the kernel to use updated packages.\n", "✅ Packages installed (without TensorFlow)\n", " Please switch to Python 3.9-3.11 kernel to use deep learning models\n" ] } ], "source": [ "# Install required packages using pip magic (ensures correct kernel environment)\n", "# Note: TensorFlow requires Python 3.9-3.11. If you see errors, switch to venv kernel or use Python 3.11\n", "\n", "import sys\n", "print(f'🐍 Current Python: {sys.version}')\n", "\n", "# Check Python version\n", "major, minor = sys.version_info[:2]\n", "if major == 3 and 9 <= minor <= 11:\n", " %pip install -q tensorflow scikit-learn pandas numpy matplotlib seaborn imbalanced-learn nest_asyncio tqdm\n", " print('✅ All packages installed including TensorFlow')\n", "else:\n", " print(f'⚠️ Python {major}.{minor} detected. 
TensorFlow requires Python 3.9-3.11')\n", " print(' Installing other packages without TensorFlow...')\n", " %pip install -q scikit-learn pandas numpy matplotlib seaborn imbalanced-learn nest_asyncio tqdm\n", " print('✅ Packages installed (without TensorFlow)')\n", " print(' Please switch to Python 3.9-3.11 kernel to use deep learning models')" ] }, { "cell_type": "code", "execution_count": 3, "id": "f1af9c6b", "metadata": {}, "outputs": [ { "ename": "ModuleNotFoundError", "evalue": "No module named 'matplotlib'", "output_type": "error", "traceback": [ "\u001b[31m---------------------------------------------------------------------------\u001b[39m", "\u001b[31mModuleNotFoundError\u001b[39m Traceback (most recent call last)", "\u001b[36mCell\u001b[39m\u001b[36m \u001b[39m\u001b[32mIn[3]\u001b[39m\u001b[32m, line 7\u001b[39m\n\u001b[32m 5\u001b[39m \u001b[38;5;28;01mimport\u001b[39;00m\u001b[38;5;250m \u001b[39m\u001b[34;01mnumpy\u001b[39;00m\u001b[38;5;250m \u001b[39m\u001b[38;5;28;01mas\u001b[39;00m\u001b[38;5;250m \u001b[39m\u001b[34;01mnp\u001b[39;00m\n\u001b[32m 6\u001b[39m \u001b[38;5;28;01mimport\u001b[39;00m\u001b[38;5;250m \u001b[39m\u001b[34;01mpandas\u001b[39;00m\u001b[38;5;250m \u001b[39m\u001b[38;5;28;01mas\u001b[39;00m\u001b[38;5;250m \u001b[39m\u001b[34;01mpd\u001b[39;00m\n\u001b[32m----> \u001b[39m\u001b[32m7\u001b[39m \u001b[38;5;28;01mimport\u001b[39;00m\u001b[38;5;250m \u001b[39m\u001b[34;01mmatplotlib\u001b[39;00m\u001b[34;01m.\u001b[39;00m\u001b[34;01mpyplot\u001b[39;00m\u001b[38;5;250m \u001b[39m\u001b[38;5;28;01mas\u001b[39;00m\u001b[38;5;250m \u001b[39m\u001b[34;01mplt\u001b[39;00m\n\u001b[32m 8\u001b[39m \u001b[38;5;28;01mimport\u001b[39;00m\u001b[38;5;250m \u001b[39m\u001b[34;01mseaborn\u001b[39;00m\u001b[38;5;250m \u001b[39m\u001b[38;5;28;01mas\u001b[39;00m\u001b[38;5;250m \u001b[39m\u001b[34;01msns\u001b[39;00m\n\u001b[32m 9\u001b[39m \u001b[38;5;28;01mfrom\u001b[39;00m\u001b[38;5;250m \u001b[39m\u001b[34;01mpathlib\u001b[39;00m\u001b[38;5;250m \u001b[39m\u001b[38;5;28;01mimport\u001b[39;00m Path\n", "\u001b[31mModuleNotFoundError\u001b[39m: No module named 'matplotlib'" ] } ], "source": [ "import os\n", "import sys\n", "import asyncio\n", "import warnings\n", "import numpy as np\n", "import pandas as pd\n", "import matplotlib.pyplot as plt\n", "import seaborn as sns\n", "from pathlib import Path\n", "from datetime import datetime\n", "import json\n", "import joblib\n", "\n", "# ML\n", "from sklearn.model_selection import train_test_split, StratifiedKFold\n", "from sklearn.preprocessing import StandardScaler, LabelEncoder\n", "from sklearn.metrics import (\n", " classification_report, confusion_matrix, roc_auc_score,\n", " roc_curve, precision_recall_curve, f1_score, accuracy_score\n", ")\n", "\n", "# Deep Learning\n", "import tensorflow as tf\n", "from tensorflow.keras.models import Model, Sequential\n", "from tensorflow.keras.layers import (\n", " Input, Dense, Dropout, BatchNormalization, \n", " Conv1D, MaxPooling1D, GlobalMaxPooling1D, Flatten,\n", " LSTM, GRU, Bidirectional, Attention, MultiHeadAttention,\n", " Concatenate, Add, LayerNormalization, Embedding\n", ")\n", "from tensorflow.keras.optimizers import Adam, AdamW\n", "from tensorflow.keras.callbacks import EarlyStopping, ReduceLROnPlateau, ModelCheckpoint\n", "from tensorflow.keras.regularizers import l1_l2\n", "\n", "from imblearn.over_sampling import SMOTE\n", "\n", "# Config\n", "warnings.filterwarnings('ignore')\n", "os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'\n", "np.random.seed(42)\n", 
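"# Seeding NumPy and TF makes reruns repeatable on CPU; GPU kernels can still be\n", "# nondeterministic unless op determinism is enabled (TF >= 2.9, at a speed cost):\n", "# tf.config.experimental.enable_op_determinism()\n",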
"tf.random.set_seed(42)\n", "\n", "# Add path\n", "sys.path.insert(0, str(Path.cwd().parent / 'app' / 'services'))\n", "\n", "try:\n", " import nest_asyncio\n", " nest_asyncio.apply()\n", "except:\n", " pass\n", "\n", "plt.style.use('dark_background')\n", "\n", "print('🚀 Environment ready!')\n", "print(f' TensorFlow: {tf.__version__}')\n", "print(f' GPU available: {len(tf.config.list_physical_devices(\"GPU\")) > 0}')" ] }, { "cell_type": "markdown", "id": "7962e94f", "metadata": {}, "source": [ "## 📥 Load Security Datasets" ] }, { "cell_type": "code", "execution_count": null, "id": "65ed96aa", "metadata": {}, "outputs": [], "source": [ "from web_security_datasets import WebSecurityDatasetManager\n", "\n", "DATASET_DIR = Path.cwd().parent / 'datasets' / 'web_security'\n", "manager = WebSecurityDatasetManager(str(DATASET_DIR))\n", "\n", "# Download if needed\n", "async def ensure_datasets():\n", " if len(manager.downloaded_datasets) < 5:\n", " print('📥 Downloading datasets...')\n", " await manager.download_all_datasets()\n", " return manager.downloaded_datasets\n", "\n", "datasets = asyncio.run(ensure_datasets())\n", "print(f'\\n✅ {len(datasets)} datasets available')" ] }, { "cell_type": "code", "execution_count": null, "id": "369d8983", "metadata": {}, "outputs": [], "source": [ "# Load combined dataset for multi-domain training\n", "async def load_combined(max_per_ds: int = 20000):\n", " return await manager.get_combined_dataset(max_samples_per_dataset=max_per_ds)\n", "\n", "combined_df = asyncio.run(load_combined())\n", "print(f'📊 Combined dataset: {len(combined_df):,} samples')\n", "print(f' Features: {combined_df.shape[1]}')\n", "print(f' Categories: {combined_df[\"_category\"].value_counts().to_dict()}')" ] }, { "cell_type": "markdown", "id": "3fc0c63d", "metadata": {}, "source": [ "## 🏗️ Deep Learning Architectures" ] }, { "cell_type": "code", "execution_count": null, "id": "f834f8a9", "metadata": {}, "outputs": [], "source": [ "class DeepSecurityModels:\n", " \"\"\"Advanced deep learning models for security classification.\"\"\"\n", " \n", " @staticmethod\n", " def transformer_block(x, embed_dim, num_heads, ff_dim, dropout=0.1):\n", " \"\"\"Transformer encoder block.\"\"\"\n", " # Multi-head attention\n", " attn_output = MultiHeadAttention(\n", " key_dim=embed_dim, num_heads=num_heads, dropout=dropout\n", " )(x, x)\n", " x1 = LayerNormalization(epsilon=1e-6)(x + attn_output)\n", " \n", " # Feed-forward\n", " ff = Dense(ff_dim, activation='relu')(x1)\n", " ff = Dropout(dropout)(ff)\n", " ff = Dense(embed_dim)(ff)\n", " return LayerNormalization(epsilon=1e-6)(x1 + ff)\n", " \n", " @staticmethod\n", " def create_transformer_classifier(input_dim: int, \n", " embed_dim: int = 64,\n", " num_heads: int = 4,\n", " ff_dim: int = 128,\n", " num_blocks: int = 2) -> Model:\n", " \"\"\"Transformer-based security classifier.\"\"\"\n", " inputs = Input(shape=(input_dim,))\n", " \n", " # Project to embedding dimension\n", " x = Dense(embed_dim)(inputs)\n", " x = tf.expand_dims(x, axis=1) # Add sequence dimension\n", " \n", " # Stack transformer blocks\n", " for _ in range(num_blocks):\n", " x = DeepSecurityModels.transformer_block(x, embed_dim, num_heads, ff_dim)\n", " \n", " # Global pooling and classification\n", " x = tf.squeeze(x, axis=1)\n", " x = Dropout(0.2)(x)\n", " x = Dense(32, activation='relu')(x)\n", " outputs = Dense(1, activation='sigmoid')(x)\n", " \n", " model = Model(inputs, outputs, name='transformer_classifier')\n", " model.compile(\n", " optimizer=AdamW(learning_rate=1e-4),\n", 
" loss='binary_crossentropy',\n", " metrics=['accuracy', 'AUC']\n", " )\n", " return model\n", " \n", " @staticmethod\n", " def create_cnn_classifier(input_dim: int) -> Model:\n", " \"\"\"1D CNN for security pattern detection.\"\"\"\n", " inputs = Input(shape=(input_dim, 1))\n", " \n", " # Conv blocks\n", " x = Conv1D(64, 3, activation='relu', padding='same')(inputs)\n", " x = BatchNormalization()(x)\n", " x = MaxPooling1D(2)(x)\n", " \n", " x = Conv1D(128, 3, activation='relu', padding='same')(x)\n", " x = BatchNormalization()(x)\n", " x = MaxPooling1D(2)(x)\n", " \n", " x = Conv1D(256, 3, activation='relu', padding='same')(x)\n", " x = GlobalMaxPooling1D()(x)\n", " \n", " # Classification head\n", " x = Dense(64, activation='relu')(x)\n", " x = Dropout(0.3)(x)\n", " outputs = Dense(1, activation='sigmoid')(x)\n", " \n", " model = Model(inputs, outputs, name='cnn_classifier')\n", " model.compile(\n", " optimizer=Adam(learning_rate=1e-3),\n", " loss='binary_crossentropy',\n", " metrics=['accuracy', 'AUC']\n", " )\n", " return model\n", " \n", " @staticmethod\n", " def create_lstm_classifier(input_dim: int) -> Model:\n", " \"\"\"Bidirectional LSTM for sequence analysis.\"\"\"\n", " inputs = Input(shape=(input_dim, 1))\n", " \n", " x = Bidirectional(LSTM(64, return_sequences=True))(inputs)\n", " x = Dropout(0.3)(x)\n", " x = Bidirectional(LSTM(32))(x)\n", " x = Dropout(0.3)(x)\n", " \n", " x = Dense(32, activation='relu')(x)\n", " outputs = Dense(1, activation='sigmoid')(x)\n", " \n", " model = Model(inputs, outputs, name='lstm_classifier')\n", " model.compile(\n", " optimizer=Adam(learning_rate=1e-3),\n", " loss='binary_crossentropy',\n", " metrics=['accuracy', 'AUC']\n", " )\n", " return model\n", " \n", " @staticmethod\n", " def create_autoencoder(input_dim: int, encoding_dim: int = 32) -> tuple:\n", " \"\"\"Autoencoder for anomaly detection.\"\"\"\n", " # Encoder\n", " inputs = Input(shape=(input_dim,))\n", " x = Dense(128, activation='relu')(inputs)\n", " x = BatchNormalization()(x)\n", " x = Dense(64, activation='relu')(x)\n", " x = BatchNormalization()(x)\n", " encoded = Dense(encoding_dim, activation='relu', name='encoding')(x)\n", " \n", " # Decoder\n", " x = Dense(64, activation='relu')(encoded)\n", " x = BatchNormalization()(x)\n", " x = Dense(128, activation='relu')(x)\n", " x = BatchNormalization()(x)\n", " decoded = Dense(input_dim, activation='linear')(x)\n", " \n", " autoencoder = Model(inputs, decoded, name='autoencoder')\n", " autoencoder.compile(optimizer=Adam(1e-3), loss='mse')\n", " \n", " encoder = Model(inputs, encoded, name='encoder')\n", " \n", " return autoencoder, encoder\n", " \n", " @staticmethod\n", " def create_multi_task_model(input_dim: int, num_tasks: int = 3) -> Model:\n", " \"\"\"Multi-task model for multiple security domains.\"\"\"\n", " inputs = Input(shape=(input_dim,))\n", " \n", " # Shared layers\n", " shared = Dense(256, activation='relu')(inputs)\n", " shared = BatchNormalization()(shared)\n", " shared = Dropout(0.3)(shared)\n", " shared = Dense(128, activation='relu')(shared)\n", " shared = BatchNormalization()(shared)\n", " shared = Dropout(0.2)(shared)\n", " shared = Dense(64, activation='relu')(shared)\n", " \n", " # Task-specific heads\n", " outputs = []\n", " task_names = ['phishing', 'malware', 'intrusion']\n", " for i in range(min(num_tasks, len(task_names))):\n", " task_layer = Dense(32, activation='relu', name=f'{task_names[i]}_hidden')(shared)\n", " task_output = Dense(1, activation='sigmoid', 
name=f'{task_names[i]}_output')(task_layer)\n", "            outputs.append(task_output)\n", "\n", "        model = Model(inputs, outputs, name='multi_task_security')\n", "        model.compile(\n", "            optimizer=Adam(1e-3),\n", "            loss={f'{task_names[i]}_output': 'binary_crossentropy' for i in range(len(outputs))},\n", "            metrics=['accuracy']\n", "        )\n", "        return model\n", "\n", "print('✅ Deep learning architectures defined')" ] },
{ "cell_type": "markdown", "id": "abdaab25", "metadata": {}, "source": [ "## 🎯 Training Pipeline" ] },
{ "cell_type": "code", "execution_count": null, "id": "673c6e4b", "metadata": {}, "outputs": [], "source": [ "def prepare_data_for_training(df: pd.DataFrame, max_features: int = 50) -> tuple:\n", "    \"\"\"Prepare data for deep learning training.\"\"\"\n", "\n", "    # Find target column\n", "    target_candidates = ['is_malicious', 'is_attack', 'is_malware', 'is_spam',\n", "                         'is_dga', 'is_miner', 'label', 'result']\n", "    target_col = None\n", "    for col in target_candidates:\n", "        if col in df.columns:\n", "            target_col = col\n", "            break\n", "\n", "    if target_col is None:\n", "        # Fall back to any numeric binary column (string labels would break astype(int) below)\n", "        for col in df.columns:\n", "            if (df[col].nunique() == 2 and col not in ['_category', '_dataset_id']\n", "                    and pd.api.types.is_numeric_dtype(df[col])):\n", "                target_col = col\n", "                break\n", "\n", "    if target_col is None:\n", "        raise ValueError('No target column found')\n", "\n", "    # Select numeric features\n", "    exclude = [target_col, '_category', '_dataset_id', 'source_dataset', 'url', 'payload', 'domain']\n", "    feature_cols = [c for c in df.select_dtypes(include=[np.number]).columns if c not in exclude]\n", "\n", "    # Limit features\n", "    if len(feature_cols) > max_features:\n", "        feature_cols = feature_cols[:max_features]\n", "\n", "    X = df[feature_cols].fillna(0).replace([np.inf, -np.inf], 0)\n", "    y = df[target_col].astype(int)\n", "\n", "    # Scale\n", "    scaler = StandardScaler()\n", "    X_scaled = scaler.fit_transform(X)\n", "\n", "    return X_scaled, y.values, feature_cols, scaler\n", "\n", "# Prepare data\n", "X, y, features, scaler = prepare_data_for_training(combined_df)\n", "print(f'📊 Data prepared: {X.shape}')\n", "print(f'   Features: {len(features)}')\n", "print(f'   Class balance: {np.bincount(y)}')" ] },
{ "cell_type": "code", "execution_count": null, "id": "9caabf5f", "metadata": {}, "outputs": [], "source": [ "# Split and balance data\n", "X_train, X_test, y_train, y_test = train_test_split(\n", "    X, y, test_size=0.2, random_state=42, stratify=y\n", ")\n", "\n", "# Balance training data (see the class-weight alternative below)\n", "try:\n", "    smote = SMOTE(random_state=42)\n", "    X_train_balanced, y_train_balanced = smote.fit_resample(X_train, y_train)\n", "    print(f'✅ After SMOTE: {len(X_train_balanced):,} training samples')\n", "except Exception as e:\n", "    X_train_balanced, y_train_balanced = X_train, y_train\n", "    print(f'⚠️ SMOTE skipped: {e}')\n", "\n", "print(f'   Train: {len(X_train_balanced):,} | Test: {len(X_test):,}')" ] },
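{ "cell_type": "markdown", "id": "b7c2e4a1", "metadata": {}, "source": [ "If SMOTE is skipped (e.g. too few minority samples for its k-nearest-neighbour synthesis), class weighting is a lighter alternative: it rescales the loss per class instead of synthesizing samples. A minimal sketch, assuming the `y_train` split above; `class_weight_dict` is a name local to this sketch and can be passed to `model.fit(class_weight=...)`." ] },
{ "cell_type": "code", "execution_count": null, "id": "b7c2e4a2", "metadata": {}, "outputs": [], "source": [ "# Alternative to SMOTE: reweight the loss per class instead of oversampling.\n", "# Sketch only -- pass class_weight_dict via model.fit(class_weight=...).\n", "from sklearn.utils.class_weight import compute_class_weight\n", "\n", "classes = np.unique(y_train)\n", "weights = compute_class_weight('balanced', classes=classes, y=y_train)\n", "class_weight_dict = {int(c): float(w) for c, w in zip(classes, weights)}\n", "print(f'Class weights: {class_weight_dict}')" ] },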
{ "cell_type": "code", "execution_count": null, "id": "ccee951f", "metadata": {}, "outputs": [], "source": [ "# Training callbacks\n", "callbacks = [\n", "    EarlyStopping(monitor='val_loss', patience=10, restore_best_weights=True),\n", "    ReduceLROnPlateau(monitor='val_loss', factor=0.2, patience=5, min_lr=1e-6)\n", "]\n", "\n", "# Train Transformer model\n", "print('🔄 Training Transformer model...')\n", "transformer = DeepSecurityModels.create_transformer_classifier(X.shape[1])\n", "\n", "history_transformer = transformer.fit(\n", "    X_train_balanced, y_train_balanced,\n", "    validation_split=0.2,\n", "    epochs=50,\n", "    batch_size=64,\n", "    callbacks=callbacks,\n", "    verbose=1\n", ")\n", "\n", "# Predict once and reuse for both the thresholded labels and the AUC\n", "transformer_probs = transformer.predict(X_test, verbose=0).flatten()\n", "transformer_pred = (transformer_probs > 0.5).astype(int)\n", "transformer_auc = roc_auc_score(y_test, transformer_probs)\n", "print(f'\\n✅ Transformer AUC: {transformer_auc:.4f}')" ] },
{ "cell_type": "code", "execution_count": null, "id": "5d0c55b2", "metadata": {}, "outputs": [], "source": [ "# Train CNN model\n", "print('🔄 Training CNN model...')\n", "\n", "# Conv1D/LSTM layers expect 3D input: (samples, timesteps, channels)\n", "X_train_cnn = X_train_balanced.reshape(-1, X_train_balanced.shape[1], 1)\n", "X_test_cnn = X_test.reshape(-1, X_test.shape[1], 1)\n", "\n", "cnn = DeepSecurityModels.create_cnn_classifier(X.shape[1])\n", "\n", "history_cnn = cnn.fit(\n", "    X_train_cnn, y_train_balanced,\n", "    validation_split=0.2,\n", "    epochs=50,\n", "    batch_size=64,\n", "    callbacks=callbacks,\n", "    verbose=1\n", ")\n", "\n", "cnn_probs = cnn.predict(X_test_cnn, verbose=0).flatten()\n", "cnn_pred = (cnn_probs > 0.5).astype(int)\n", "cnn_auc = roc_auc_score(y_test, cnn_probs)\n", "print(f'\\n✅ CNN AUC: {cnn_auc:.4f}')" ] },
{ "cell_type": "code", "execution_count": null, "id": "3299c3c0", "metadata": {}, "outputs": [], "source": [ "# Train LSTM model\n", "print('🔄 Training LSTM model...')\n", "\n", "lstm = DeepSecurityModels.create_lstm_classifier(X.shape[1])\n", "\n", "history_lstm = lstm.fit(\n", "    X_train_cnn, y_train_balanced,  # Same 3D shape as the CNN input\n", "    validation_split=0.2,\n", "    epochs=30,  # Fewer epochs: LSTM training is slower\n", "    batch_size=64,\n", "    callbacks=callbacks,\n", "    verbose=1\n", ")\n", "\n", "lstm_probs = lstm.predict(X_test_cnn, verbose=0).flatten()\n", "lstm_pred = (lstm_probs > 0.5).astype(int)\n", "lstm_auc = roc_auc_score(y_test, lstm_probs)\n", "print(f'\\n✅ LSTM AUC: {lstm_auc:.4f}')" ] },
{ "cell_type": "code", "execution_count": null, "id": "c47177bf", "metadata": {}, "outputs": [], "source": [ "# Train Autoencoder for anomaly detection\n", "print('🔄 Training Autoencoder...')\n", "\n", "# Train only on normal samples so that attacks reconstruct poorly\n", "# (assumes class 0 is benign; if it was the minority class, SMOTE may have synthesized some rows)\n", "X_normal = X_train_balanced[y_train_balanced == 0]\n", "\n", "autoencoder, encoder = DeepSecurityModels.create_autoencoder(X.shape[1])\n", "\n", "history_ae = autoencoder.fit(\n", "    X_normal, X_normal,\n", "    validation_split=0.2,\n", "    epochs=50,\n", "    batch_size=64,\n", "    callbacks=callbacks,\n", "    verbose=1\n", ")\n", "\n", "# Anomaly scores based on reconstruction error\n", "reconstructions = autoencoder.predict(X_test, verbose=0)\n", "mse = np.mean(np.power(X_test - reconstructions, 2), axis=1)\n", "threshold = np.percentile(mse, 90)  # Flag the top 10% as anomalies\n", "ae_pred = (mse > threshold).astype(int)\n", "ae_auc = roc_auc_score(y_test, mse)\n", "print(f'\\n✅ Autoencoder AUC: {ae_auc:.4f}')" ] },
{ "cell_type": "markdown", "id": "874d717c", "metadata": {}, "source": [ "## 📊 Model Comparison" ] },
{ "cell_type": "code", "execution_count": null, "id": "58a05f84", "metadata": {}, "outputs": [], "source": [ "# Compare all models\n", "results = {\n", "    'Transformer': {'pred': transformer_pred, 'auc': transformer_auc},\n", "    'CNN': {'pred': cnn_pred, 'auc': cnn_auc},\n", "    'LSTM': {'pred': lstm_pred, 'auc': lstm_auc},\n", "    'Autoencoder': {'pred': ae_pred, 'auc': ae_auc}\n", "}\n", "\n", "# Results table\n", "print('📊 Deep Learning Model Comparison')\n", "print('=' * 60)\n", "print(f'{\"Model\":<15} {\"Accuracy\":<12} {\"F1\":<12} {\"AUC\":<12}')\n", "print('-' * 60)\n", "\n", "for name, res in results.items():\n", "    acc = accuracy_score(y_test, res['pred'])\n", "    f1 = f1_score(y_test, res['pred'])\n", "    print(f'{name:<15} {acc:<12.4f} {f1:<12.4f} {res[\"auc\"]:<12.4f}')\n", "\n", "# Best model\n", "best_model = max(results.items(), key=lambda x: x[1]['auc'])\n", "print(f'\\n🏆 Best Model: {best_model[0]} (AUC: {best_model[1][\"auc\"]:.4f})')" ] },
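{ "cell_type": "markdown", "id": "e3a7c901", "metadata": {}, "source": [ "A simple follow-up worth trying: averaging the three supervised models' probabilities often smooths out individual errors. A minimal sketch using the `*_probs` arrays computed above; the `ensemble_*` names are local to this sketch and not part of the saved artifacts." ] },
{ "cell_type": "code", "execution_count": null, "id": "e3a7c902", "metadata": {}, "outputs": [], "source": [ "# Unweighted soft-voting ensemble of the three supervised classifiers\n", "ensemble_probs = (transformer_probs + cnn_probs + lstm_probs) / 3\n", "ensemble_auc = roc_auc_score(y_test, ensemble_probs)\n", "print(f'🤝 Ensemble (mean of 3) AUC: {ensemble_auc:.4f}')" ] },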
{res[\"auc\"]:<12.4f}')\n", "\n", "# Best model\n", "best_model = max(results.items(), key=lambda x: x[1]['auc'])\n", "print(f'\\n🏆 Best Model: {best_model[0]} (AUC: {best_model[1][\"auc\"]:.4f})')" ] }, { "cell_type": "code", "execution_count": null, "id": "6ffe5221", "metadata": {}, "outputs": [], "source": [ "# Visualize ROC curves\n", "plt.figure(figsize=(10, 8))\n", "\n", "# Get probabilities\n", "probs = {\n", " 'Transformer': transformer.predict(X_test, verbose=0).flatten(),\n", " 'CNN': cnn.predict(X_test_cnn, verbose=0).flatten(),\n", " 'LSTM': lstm.predict(X_test_cnn, verbose=0).flatten(),\n", " 'Autoencoder': mse / mse.max() # Normalized MSE\n", "}\n", "\n", "colors = ['#4ecdc4', '#ff6b6b', '#ffe66d', '#95e1d3']\n", "for (name, prob), color in zip(probs.items(), colors):\n", " fpr, tpr, _ = roc_curve(y_test, prob)\n", " auc = results[name]['auc']\n", " plt.plot(fpr, tpr, label=f'{name} (AUC = {auc:.4f})', color=color, linewidth=2)\n", "\n", "plt.plot([0, 1], [0, 1], 'k--', alpha=0.5)\n", "plt.xlabel('False Positive Rate', fontsize=12)\n", "plt.ylabel('True Positive Rate', fontsize=12)\n", "plt.title('🎯 Deep Learning ROC Comparison', fontsize=14)\n", "plt.legend(loc='lower right')\n", "plt.grid(True, alpha=0.3)\n", "plt.tight_layout()\n", "plt.show()" ] }, { "cell_type": "code", "execution_count": null, "id": "ef891827", "metadata": {}, "outputs": [], "source": [ "# Training history visualization\n", "fig, axes = plt.subplots(1, 3, figsize=(15, 4))\n", "\n", "histories = [\n", " ('Transformer', history_transformer),\n", " ('CNN', history_cnn),\n", " ('LSTM', history_lstm)\n", "]\n", "\n", "for ax, (name, hist) in zip(axes, histories):\n", " ax.plot(hist.history['loss'], label='Train Loss')\n", " ax.plot(hist.history['val_loss'], label='Val Loss')\n", " ax.set_title(f'{name} Training', color='white')\n", " ax.set_xlabel('Epoch')\n", " ax.set_ylabel('Loss')\n", " ax.legend()\n", " ax.grid(True, alpha=0.3)\n", "\n", "plt.tight_layout()\n", "plt.show()" ] }, { "cell_type": "markdown", "id": "7871e52a", "metadata": {}, "source": [ "## 💾 Save Models" ] }, { "cell_type": "code", "execution_count": null, "id": "0d7755e9", "metadata": {}, "outputs": [], "source": [ "# Save trained models\n", "MODELS_DIR = Path.cwd().parent / 'models' / 'deep_learning'\n", "MODELS_DIR.mkdir(parents=True, exist_ok=True)\n", "\n", "print('💾 Saving models...')\n", "\n", "# Save Keras models\n", "transformer.save(MODELS_DIR / 'transformer_security.keras')\n", "cnn.save(MODELS_DIR / 'cnn_security.keras')\n", "lstm.save(MODELS_DIR / 'lstm_security.keras')\n", "autoencoder.save(MODELS_DIR / 'autoencoder_security.keras')\n", "encoder.save(MODELS_DIR / 'encoder_security.keras')\n", "\n", "# Save scaler and config\n", "joblib.dump(scaler, MODELS_DIR / 'scaler.pkl')\n", "joblib.dump(features, MODELS_DIR / 'feature_names.pkl')\n", "\n", "# Save metrics\n", "metrics = {\n", " name: {'accuracy': float(accuracy_score(y_test, r['pred'])),\n", " 'f1': float(f1_score(y_test, r['pred'])),\n", " 'auc': float(r['auc'])}\n", " for name, r in results.items()\n", "}\n", "with open(MODELS_DIR / 'metrics.json', 'w') as f:\n", " json.dump(metrics, f, indent=2)\n", "\n", "print(f'\\n✅ Models saved to {MODELS_DIR}')" ] }, { "cell_type": "markdown", "id": "765404ff", "metadata": {}, "source": [ "## 🎉 Summary\n", "\n", "### Trained Models:\n", "- **Transformer** - Attention-based classifier\n", "- **CNN** - Convolutional pattern detector\n", "- **LSTM** - Sequence analyzer\n", "- **Autoencoder** - Anomaly detector\n", "\n", "### Output 
{ "cell_type": "markdown", "id": "765404ff", "metadata": {}, "source": [ "## 🎉 Summary\n", "\n", "### Trained Models:\n", "- **Transformer** - Attention-based classifier\n", "- **CNN** - Convolutional pattern detector\n", "- **LSTM** - Sequence analyzer\n", "- **Autoencoder** - Anomaly detector\n", "\n", "### Output Files:\n", "```\n", "models/deep_learning/\n", "├── transformer_security.keras\n", "├── cnn_security.keras\n", "├── lstm_security.keras\n", "├── autoencoder_security.keras\n", "├── encoder_security.keras\n", "├── scaler.pkl\n", "├── feature_names.pkl\n", "└── metrics.json\n", "```\n", "\n", "These models are ready for integration with the Agentic AI security system!" ] } ], "metadata": { "kernelspec": { "display_name": ".venv", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.15.0a3" } }, "nbformat": 4, "nbformat_minor": 5 }