{ "cells": [ { "cell_type": "markdown", "id": "eb248348", "metadata": {}, "source": [ "# Banana Feature Prediction Model\n", "\n", "This notebook implements a machine learning model to predict:\n", "1. The number of seeds in a banana\n", "2. The curvature of the banana (in degrees)\n", "\n", "The predictions are based on physical features visible in side-view images. The model is trained on the labeled dataset described in the next section." ] }, { "cell_type": "markdown", "id": "4ae6978b", "metadata": {}, "source": [ "# Dataset Requirements\n", "\n", "## Image Data\n", "- Format: JPG or PNG\n", "- Resolution: 224x224 pixels (standard for many vision models)\n", "- Perspective: Side view of the banana\n", "- Background: Preferably solid color or simple background\n", "- Lighting: Consistent, well-lit conditions\n", "- File naming convention: `banana_[id].jpg`\n", "\n", "## Labels Dataset (CSV format)\n", "Required columns:\n", "1. `image_filename` (string): Name of the corresponding image file\n", "2. `length_cm` (float): Length of the banana in centimeters\n", "3. `width_cm` (float): Width at the middle point in centimeters\n", "4. `weight_g` (float): Weight in grams\n", "5. `ripeness` (int): Scale from 1-5 where:\n", " - 1: Very unripe (green)\n", " - 2: Unripe (more green than yellow)\n", " - 3: Ripe (yellow with few brown spots)\n", " - 4: Very ripe (yellow with many brown spots)\n", " - 5: Overripe (mostly brown)\n", "6. `color_code` (int): \n", " - 1: Green\n", " - 2: Yellow\n", " - 3: Brown\n", "7. `seed_count` (int): Actual number of seeds\n", "8. `curvature_degrees` (float): Measured angle of curvature\n", "\n", "Example CSV format:\n", "```csv\n", "image_filename,length_cm,width_cm,weight_g,ripeness,color_code,seed_count,curvature_degrees\n", "banana_001.jpg,15.2,3.1,125.5,3,2,8,45.2\n", "banana_002.jpg,14.8,2.9,118.3,4,2,12,38.7\n", "```\n", "\n", "## Measurement Guidelines\n", "1. Length: Measure along the outer curve from tip to tip\n", "2. 
Width: Measure at the widest point\n", "3. Curvature: Measure the angle between the stem and tip\n", "4. Seeds: Count after careful dissection\n", "5. Weight: Weigh the peeled banana in grams" ] }, { "cell_type": "code", "execution_count": 1, "id": "548585a2", "metadata": {}, "outputs": [], "source": [ "# Import required libraries\n", "import numpy as np\n", "import pandas as pd\n", "from sklearn.model_selection import train_test_split\n", "from sklearn.preprocessing import StandardScaler\n", "from sklearn.ensemble import RandomForestRegressor\n", "from sklearn.metrics import mean_squared_error, r2_score" ] }, { "cell_type": "code", "execution_count": 2, "id": "c8e3fa1e", "metadata": {}, "outputs": [], "source": [ "# Additional imports for image processing and deep learning\n", "import torch\n", "import torch.nn as nn\n", "import torch.optim as optim\n", "from torchvision import models, transforms\n", "from PIL import Image\n", "import cv2\n", "from pathlib import Path" ] }, { "cell_type": "code", "execution_count": 3, "id": "4dad3ff0", "metadata": {}, "outputs": [], "source": [ "# Define image preprocessing\n", "def preprocess_image(image_path):\n", " \"\"\"Preprocess image for the model\"\"\"\n", " transform = transforms.Compose([\n", " transforms.Resize((224, 224)),\n", " transforms.ToTensor(),\n", " transforms.Normalize(mean=[0.485, 0.456, 0.406],\n", " std=[0.229, 0.224, 0.225])\n", " ])\n", " \n", " image = Image.open(image_path).convert('RGB')\n", " return transform(image).unsqueeze(0)\n", "\n", "# Define the CNN model\n", "class BananaNet(nn.Module):\n", " def __init__(self):\n", " super(BananaNet, self).__init__()\n", " # Use pretrained ResNet18 as base model ('weights' replaces the deprecated 'pretrained' flag)\n", " self.base_model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)\n", " \n", " # Modify the final layer for our regression tasks\n", " num_features = self.base_model.fc.in_features\n", " 
self.base_model.fc = nn.Identity() # Remove final layer\n", " \n", " # Add custom layers for our specific predictions\n", " self.regression_head = nn.Sequential(\n", " nn.Linear(num_features, 512),\n", " nn.ReLU(),\n", " nn.Dropout(0.3),\n", " nn.Linear(512, 256),\n", " nn.ReLU(),\n", " nn.Dropout(0.3),\n", " nn.Linear(256, 2) # 2 outputs: seed_count and curvature\n", " )\n", " \n", " def forward(self, x):\n", " features = self.base_model(x)\n", " output = self.regression_head(features)\n", " return output # Returns [seed_count, curvature]" ] }, { "cell_type": "code", "execution_count": 4, "id": "6501a0e2", "metadata": {}, "outputs": [], "source": [ "# Function to make predictions from image\n", "def predict_from_image(model, image_path):\n", " \"\"\"\n", " Predict seed count and curvature from a banana image\n", " \n", " Args:\n", " model: Trained BananaNet model\n", " image_path: Path to the banana image\n", " \n", " Returns:\n", " dict: Predictions for seed count and curvature\n", " \"\"\"\n", " # Preprocess image\n", " image_tensor = preprocess_image(image_path)\n", " \n", " # Set model to evaluation mode\n", " model.eval()\n", " \n", " with torch.no_grad():\n", " # Get predictions\n", " predictions = model(image_tensor)\n", " \n", " # Extract predictions\n", " seed_count = int(round(predictions[0][0].item()))\n", " curvature = round(predictions[0][1].item(), 1)\n", " \n", " return {\n", " 'seeds': max(0, seed_count), # Ensure non-negative\n", " 'curvature': max(20, min(70, curvature)) # Clip to valid range\n", " }" ] }, { "cell_type": "code", "execution_count": null, "id": "9ca12bcf", "metadata": {}, "outputs": [], "source": [ "# Load and prepare the dataset\n", "class BananaDataset(torch.utils.data.Dataset):\n", " def __init__(self, csv_file, img_dir, transform=None):\n", " self.data = pd.read_csv(csv_file)\n", " self.img_dir = Path(img_dir)\n", " self.transform = transform\n", " \n", " def __len__(self):\n", " return len(self.data)\n", " \n", " def 
__getitem__(self, idx):\n", " # Build the full image path from the base directory\n", " relative_img_path = self.data.iloc[idx]['image_filename']\n", " img_path = self.img_dir / relative_img_path\n", " \n", " try:\n", " image = Image.open(img_path).convert('RGB')\n", " except Exception as e:\n", " print(f\"Error loading image {img_path}: {str(e)}\")\n", " raise\n", " \n", " if self.transform:\n", " image = self.transform(image)\n", " \n", " # Get labels\n", " seed_count = self.data.iloc[idx]['seed_count']\n", " curvature = self.data.iloc[idx]['curvature_degrees']\n", " \n", " # Use the raw targets (seed count, curvature in degrees) directly.\n", " # No rescaling is applied, so the model's two outputs can be read as\n", " # seed count and curvature without an inverse transform, matching\n", " # how predict_from_image interprets them.\n", " labels = torch.tensor([seed_count, curvature], dtype=torch.float32)\n", " \n", " return image, labels\n", "\n", "# Data augmentation and normalization for training\n", "train_transform = transforms.Compose([\n", " transforms.Resize((224, 224)),\n", " transforms.RandomHorizontalFlip(),\n", " transforms.RandomRotation(10),\n", " transforms.ColorJitter(brightness=0.2, contrast=0.2),\n", " transforms.ToTensor(),\n", " transforms.Normalize(mean=[0.485, 0.456, 0.406],\n", " std=[0.229, 0.224, 0.225])\n", "])" ] }, { "cell_type": "code", "execution_count": 6, "id": "6ede33b2", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Verifying dataset...\n", "Warning: Image not found: training_data\\image\\p0.jpeg\n", "\n", "Found 31 valid images out of 32 entries\n", "Saved cleaned dataset to training_data/clean_dataset.csv\n" ] } ], "source": [ "# Verify and clean 
dataset\n", "print(\"Verifying dataset...\")\n", "df = pd.read_csv('training_data/dataset.csv')\n", "valid_images = []\n", "image_dir = Path('training_data')\n", "\n", "for idx, row in df.iterrows():\n", " img_path = image_dir / row['image_filename']\n", " if img_path.exists():\n", " valid_images.append(row)\n", " else:\n", " print(f\"Warning: Image not found: {img_path}\")\n", "\n", "# Create clean dataset\n", "clean_df = pd.DataFrame(valid_images)\n", "print(f\"\\nFound {len(clean_df)} valid images out of {len(df)} entries\")\n", "clean_df.to_csv('training_data/clean_dataset.csv', index=False)\n", "print(\"Saved cleaned dataset to training_data/clean_dataset.csv\")" ] }, { "cell_type": "code", "execution_count": 8, "id": "ddd72d6f", "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Loading cleaned dataset...\n", "\n", "Split dataset into 24 training and 7 validation samples\n", "\n", "Training on cpu\n", "==================================================\n", "\n", "Epoch 1/100\n", "----------------------------------------\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "c:\\Users\\amalb\\AppData\\Local\\Programs\\Python\\Python313\\Lib\\site-packages\\torchvision\\models\\_utils.py:208: UserWarning: The parameter 'pretrained' is deprecated since 0.13 and may be removed in the future, please use 'weights' instead.\n", " warnings.warn(\n", "c:\\Users\\amalb\\AppData\\Local\\Programs\\Python\\Python313\\Lib\\site-packages\\torchvision\\models\\_utils.py:223: UserWarning: Arguments other than a weight enum or `None` for 'weights' are deprecated since 0.13 and may be removed in the future. The current behavior is equivalent to passing `weights=ResNet18_Weights.IMAGENET1K_V1`. 
You can also use `weights=ResNet18_Weights.DEFAULT` to get the most up-to-date weights.\n", " warnings.warn(msg)\n" ] }, { "name": "stdout", "output_type": "stream", "text": [
" Batch 2/6, Loss: 109269.8203\n", " Batch 4/6, Loss: 104654.4922\n", " Batch 6/6, Loss: 113989.2031\n", "\n", "Training Loss: 109183.0417\n", "Validation Loss: 106948.2109\n", "Saved new best model!\n", "==================================================\n", "\n", "Epoch 2/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 104122.2266\n", " Batch 4/6, Loss: 90400.1562\n", " Batch 6/6, Loss: 84159.4688\n", "\n", "Training Loss: 95006.3828\n", "Validation Loss: 74246.2109\n", "Saved new best model!\n", "==================================================\n", "\n", "Epoch 3/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 67734.2734\n", " Batch 4/6, Loss: 57842.9609\n", " Batch 6/6, Loss: 37662.7617\n", "\n", "Training Loss: 58160.4733\n", "Validation Loss: 12912.5962\n", "Saved new best model!\n", "==================================================\n", "\n", "Epoch 4/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 19452.2617\n", " Batch 4/6, Loss: 6435.4668\n", " Batch 6/6, Loss: 3268.4731\n", "\n", "Training Loss: 10538.4987\n", "Validation Loss: 45018.8711\n", "==================================================\n", "\n", "Epoch 5/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 16004.5322\n", " Batch 4/6, Loss: 8597.3604\n", " Batch 6/6, Loss: 1547.1194\n", "\n", "Training Loss: 8467.1116\n", "Validation Loss: 16871.5425\n", "==================================================\n", "\n", "Epoch 6/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1839.7588\n", " Batch 4/6, Loss: 423.2393\n", " Batch 6/6, Loss: 2926.3755\n", "\n", "Training Loss: 1664.0861\n", "Validation Loss: 7083.2922\n", "Saved new best model!\n", "==================================================\n", "\n", "Epoch 7/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 2207.4912\n", " Batch 4/6, Loss: 1276.8641\n", " Batch 6/6, Loss: 1323.9146\n", "\n", "Training Loss: 1642.5232\n", "Validation Loss: 17078.7827\n", "==================================================\n", "\n", "Epoch 8/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1635.6317\n", " Batch 4/6, Loss: 1087.8344\n", " Batch 6/6, Loss: 707.8613\n", "\n", "Training Loss: 1504.2324\n", "Validation Loss: 12043.7065\n", "==================================================\n", "\n", "Epoch 9/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 370.2495\n", " Batch 4/6, Loss: 1176.6290\n", " Batch 6/6, Loss: 834.0784\n", "\n", "Training Loss: 1126.5365\n", "Validation Loss: 3363.5264\n", "Saved new best model!\n", "==================================================\n", "\n", "Epoch 10/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 955.1545\n", " Batch 4/6, Loss: 1554.7657\n", " Batch 6/6, Loss: 2333.2275\n", "\n", "Training Loss: 1782.2497\n", "Validation Loss: 287.9861\n", "Saved new best model!\n", "==================================================\n", "\n", "Epoch 11/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 694.0063\n", " Batch 4/6, Loss: 1016.2342\n", " Batch 6/6, Loss: 317.5279\n", "\n", "Training Loss: 814.0098\n", "Validation Loss: 363.4813\n", "==================================================\n", "\n", "Epoch 12/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 396.8125\n", " Batch 4/6, Loss: 2834.8699\n", " Batch 6/6, Loss: 554.5372\n", "\n", "Training Loss: 919.0196\n", "Validation Loss: 839.4999\n", "==================================================\n", "\n", "Epoch 13/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1508.1024\n", " Batch 4/6, Loss: 1642.3060\n", " Batch 6/6, Loss: 1333.8613\n", "\n", "Training Loss: 1448.0772\n", "Validation Loss: 673.2234\n", "==================================================\n", "\n", "Epoch 14/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1589.7112\n", " Batch 4/6, Loss: 880.0350\n", " Batch 6/6, Loss: 565.2096\n", "\n", "Training Loss: 908.3141\n", "Validation Loss: 481.6184\n", "==================================================\n", "\n", "Epoch 15/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 786.9286\n", " Batch 4/6, Loss: 319.4578\n", " Batch 6/6, Loss: 505.0421\n", "\n", "Training Loss: 694.5936\n", "Validation Loss: 258.8256\n", "Saved new best model!\n", "==================================================\n", "\n", "Epoch 16/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1875.3506\n", " Batch 4/6, Loss: 950.2728\n", " Batch 6/6, Loss: 489.4273\n", "\n", "Training Loss: 1013.2499\n", "Validation Loss: 530.2073\n", "==================================================\n", "\n", "Epoch 17/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 424.1327\n", " Batch 4/6, Loss: 694.6185\n", " Batch 6/6, Loss: 658.2825\n", "\n", "Training Loss: 1014.5915\n", "Validation Loss: 406.1135\n", "==================================================\n", "\n", "Epoch 18/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 829.6434\n", " Batch 4/6, Loss: 644.1646\n", " Batch 6/6, Loss: 329.4149\n", "\n", "Training Loss: 683.5819\n", "Validation Loss: 389.7952\n", "==================================================\n", "\n", "Epoch 19/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1293.7899\n", " Batch 4/6, Loss: 206.5541\n", " Batch 6/6, Loss: 897.7126\n", "\n", "Training Loss: 924.1091\n", "Validation Loss: 943.3143\n", "==================================================\n", "\n", "Epoch 20/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1068.2131\n", " Batch 4/6, Loss: 402.7136\n", " Batch 6/6, Loss: 866.8699\n", "\n", "Training Loss: 780.4582\n", "Validation Loss: 240.9058\n", "Saved new best model!\n", "==================================================\n", "\n", "Epoch 21/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 3343.7578\n", " Batch 4/6, Loss: 1420.2560\n", " Batch 6/6, Loss: 381.7576\n", "\n", "Training Loss: 1414.1050\n", "Validation Loss: 700.4522\n", "==================================================\n", "\n", "Epoch 22/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 129.3188\n", " Batch 4/6, Loss: 1145.3645\n", " Batch 6/6, Loss: 325.5650\n", "\n", "Training Loss: 771.6898\n", "Validation Loss: 245.0782\n", "==================================================\n", "\n", "Epoch 23/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1226.8917\n", " Batch 4/6, Loss: 483.1090\n", " Batch 6/6, Loss: 479.1373\n", "\n", "Training Loss: 766.2635\n", "Validation Loss: 244.3375\n", "==================================================\n", "\n", "Epoch 24/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 140.8551\n", " Batch 4/6, Loss: 1785.3615\n", " Batch 6/6, Loss: 1006.9538\n", "\n", "Training Loss: 1085.7039\n", "Validation Loss: 215.9444\n", "Saved new best model!\n", "==================================================\n", "\n", "Epoch 25/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1130.1061\n", " Batch 4/6, Loss: 393.7478\n", " Batch 6/6, Loss: 415.8807\n", "\n", "Training Loss: 624.9323\n", "Validation Loss: 220.6329\n", "==================================================\n", "\n", "Epoch 26/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1184.3910\n", " Batch 4/6, Loss: 521.7990\n", " Batch 6/6, Loss: 507.1132\n", "\n", "Training Loss: 902.3073\n", "Validation Loss: 541.2622\n", "==================================================\n", "\n", "Epoch 27/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1286.0627\n", " Batch 4/6, Loss: 476.0831\n", " Batch 6/6, Loss: 903.5569\n", "\n", "Training Loss: 857.9197\n", "Validation Loss: 241.2860\n", "==================================================\n", "\n", "Epoch 28/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1162.8234\n", " Batch 4/6, Loss: 2393.5278\n", " Batch 6/6, Loss: 926.7454\n", "\n", "Training Loss: 1437.3938\n", "Validation Loss: 218.4794\n", "==================================================\n", "\n", "Epoch 29/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 345.2092\n", " Batch 4/6, Loss: 2720.5051\n", " Batch 6/6, Loss: 1461.9178\n", "\n", "Training Loss: 1884.8524\n", "Validation Loss: 969.7208\n", "==================================================\n", "\n", "Epoch 30/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1515.7988\n", " Batch 4/6, Loss: 424.1443\n", " Batch 6/6, Loss: 800.8522\n", "\n", "Training Loss: 799.1393\n", "Validation Loss: 286.8963\n", "==================================================\n", "\n", "Epoch 31/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1050.5314\n", " Batch 4/6, Loss: 1596.0892\n", " Batch 6/6, Loss: 784.2861\n", "\n", "Training Loss: 1376.6739\n", "Validation Loss: 262.6435\n", "==================================================\n", "\n", "Epoch 32/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1964.1541\n", " Batch 4/6, Loss: 977.8395\n", " Batch 6/6, Loss: 1104.2291\n", "\n", "Training Loss: 1123.8866\n", "Validation Loss: 319.2892\n", "==================================================\n", "\n", "Epoch 33/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1275.9658\n", " Batch 4/6, Loss: 5549.7285\n", " Batch 6/6, Loss: 862.2293\n", "\n", "Training Loss: 1589.4808\n", "Validation Loss: 330.8620\n", "==================================================\n", "\n", "Epoch 34/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 322.3564\n", " Batch 4/6, Loss: 190.5795\n", " Batch 6/6, Loss: 992.4006\n", "\n", "Training Loss: 608.0670\n", "Validation Loss: 352.2329\n", "==================================================\n", "\n", "Epoch 35/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 425.0992\n", " Batch 4/6, Loss: 2652.0706\n", " Batch 6/6, Loss: 1569.4828\n", "\n", "Training Loss: 1220.0820\n", "Validation Loss: 222.9377\n", "==================================================\n", "\n", "Epoch 36/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 945.8361\n", " Batch 4/6, Loss: 568.6164\n", " Batch 6/6, Loss: 1113.9171\n", "\n", "Training Loss: 1162.5473\n", "Validation Loss: 210.0292\n", "Saved new best model!\n", "==================================================\n", "\n", "Epoch 37/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 703.2250\n", " Batch 4/6, Loss: 902.1324\n", " Batch 6/6, Loss: 613.5023\n", "\n", "Training Loss: 933.4947\n", "Validation Loss: 243.4085\n", "==================================================\n", "\n", "Epoch 38/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1124.1021\n", " Batch 4/6, Loss: 1759.7264\n", " Batch 6/6, Loss: 749.5591\n", "\n", "Training Loss: 978.4406\n", "Validation Loss: 254.2173\n", "==================================================\n", "\n", "Epoch 39/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 418.9374\n", " Batch 4/6, Loss: 426.1255\n", " Batch 6/6, Loss: 528.7054\n", "\n", "Training Loss: 681.8176\n", "Validation Loss: 418.5006\n", "==================================================\n", "\n", "Epoch 40/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 658.8388\n", " Batch 4/6, Loss: 1438.9272\n", " Batch 6/6, Loss: 1576.4991\n", "\n", "Training Loss: 1311.9740\n", "Validation Loss: 200.4713\n", "Saved new best model!\n", "==================================================\n", "\n", "Epoch 41/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 741.4327\n", " Batch 4/6, Loss: 483.5253\n", " Batch 6/6, Loss: 808.9166\n", "\n", "Training Loss: 1153.6412\n", "Validation Loss: 452.7481\n", "==================================================\n", "\n", "Epoch 42/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 452.8772\n", " Batch 4/6, Loss: 2500.1050\n", " Batch 6/6, Loss: 1510.4575\n", "\n", "Training Loss: 1118.9968\n", "Validation Loss: 356.2476\n", "==================================================\n", "\n", "Epoch 43/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 643.3315\n", " Batch 4/6, Loss: 385.2501\n", " Batch 6/6, Loss: 247.5821\n", "\n", "Training Loss: 1070.5517\n", "Validation Loss: 1392.2087\n", "==================================================\n", "\n", "Epoch 44/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1814.5935\n", " Batch 4/6, Loss: 758.2944\n", " Batch 6/6, Loss: 554.7292\n", "\n", "Training Loss: 1062.7365\n", "Validation Loss: 1887.5235\n", "==================================================\n", "\n", "Epoch 45/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 1073.3751\n", " Batch 4/6, Loss: 1061.9996\n", " Batch 6/6, Loss: 712.1802\n", "\n", "Training Loss: 749.9939\n", "Validation Loss: 785.1534\n", "==================================================\n", "\n", "Epoch 46/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 532.6222\n", " Batch 4/6, Loss: 210.9791\n", " Batch 6/6, Loss: 2107.3486\n", "\n", "Training Loss: 750.4574\n", "Validation Loss: 210.9181\n", "==================================================\n", "\n", "Epoch 47/100\n", "----------------------------------------\n",
" Batch 2/6, Loss: 786.8726\n", " Batch 4/6, Loss: 454.9923\n", " Batch 6/6, Loss: 655.9498\n", "\n", "Training Loss: 856.6683\n", "Validation Loss: 471.7114\n", "==================================================\n", "\n", "Epoch 48/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 655.9498\n", "\n", 
"Training Loss: 856.6683\n", "Validation Loss: 471.7114\n", "==================================================\n", "\n", "Epoch 48/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 3211.0278\n", " Batch 2/6, Loss: 3211.0278\n", " Batch 4/6, Loss: 1642.7689\n", " Batch 4/6, Loss: 1642.7689\n", " Batch 6/6, Loss: 2075.4155\n", "\n", "Training Loss: 2025.5602\n", "Validation Loss: 443.3635\n", "==================================================\n", "\n", "Epoch 49/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 2075.4155\n", "\n", "Training Loss: 2025.5602\n", "Validation Loss: 443.3635\n", "==================================================\n", "\n", "Epoch 49/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 2159.5000\n", " Batch 2/6, Loss: 2159.5000\n", " Batch 4/6, Loss: 320.8261\n", " Batch 4/6, Loss: 320.8261\n", " Batch 6/6, Loss: 1513.5872\n", "\n", "Training Loss: 1345.8962\n", "Validation Loss: 380.2266\n", "==================================================\n", "\n", "Epoch 50/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 1513.5872\n", "\n", "Training Loss: 1345.8962\n", "Validation Loss: 380.2266\n", "==================================================\n", "\n", "Epoch 50/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 730.8304\n", " Batch 2/6, Loss: 730.8304\n", " Batch 4/6, Loss: 1655.5721\n", " Batch 4/6, Loss: 1655.5721\n", " Batch 6/6, Loss: 1218.2632\n", "\n", "Training Loss: 1320.7316\n", "Validation Loss: 259.1231\n", "==================================================\n", "\n", "Epoch 51/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 1218.2632\n", "\n", "Training Loss: 1320.7316\n", "Validation Loss: 259.1231\n", "==================================================\n", "\n", "Epoch 51/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 2737.3723\n", " Batch 2/6, Loss: 2737.3723\n", 
" Batch 4/6, Loss: 612.8956\n", " Batch 4/6, Loss: 612.8956\n", " Batch 6/6, Loss: 1196.5129\n", "\n", "Training Loss: 1024.5598\n", "Validation Loss: 298.5450\n", "==================================================\n", "\n", "Epoch 52/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 1196.5129\n", "\n", "Training Loss: 1024.5598\n", "Validation Loss: 298.5450\n", "==================================================\n", "\n", "Epoch 52/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 2586.7422\n", " Batch 2/6, Loss: 2586.7422\n", " Batch 4/6, Loss: 340.5244\n", " Batch 4/6, Loss: 340.5244\n", " Batch 6/6, Loss: 1269.7654\n", "\n", "Training Loss: 1119.3062\n", "Validation Loss: 400.0172\n", "==================================================\n", "\n", "Epoch 53/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 1269.7654\n", "\n", "Training Loss: 1119.3062\n", "Validation Loss: 400.0172\n", "==================================================\n", "\n", "Epoch 53/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 312.1550\n", " Batch 2/6, Loss: 312.1550\n", " Batch 4/6, Loss: 1334.2424\n", " Batch 4/6, Loss: 1334.2424\n", " Batch 6/6, Loss: 1312.9694\n", "\n", "Training Loss: 922.2774\n", "Validation Loss: 299.0278\n", "==================================================\n", "\n", "Epoch 54/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 1312.9694\n", "\n", "Training Loss: 922.2774\n", "Validation Loss: 299.0278\n", "==================================================\n", "\n", "Epoch 54/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 429.4738\n", " Batch 2/6, Loss: 429.4738\n", " Batch 4/6, Loss: 2040.6050\n", " Batch 4/6, Loss: 2040.6050\n", " Batch 6/6, Loss: 2036.3914\n", "\n", "Training Loss: 1546.0343\n", "Validation Loss: 364.7983\n", "==================================================\n", "\n", "Epoch 55/100\n", 
"----------------------------------------\n", " Batch 6/6, Loss: 2036.3914\n", "\n", "Training Loss: 1546.0343\n", "Validation Loss: 364.7983\n", "==================================================\n", "\n", "Epoch 55/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 467.2721\n", " Batch 2/6, Loss: 467.2721\n", " Batch 4/6, Loss: 260.9136\n", " Batch 4/6, Loss: 260.9136\n", " Batch 6/6, Loss: 1281.7625\n", "\n", "Training Loss: 650.1836\n", "Validation Loss: 345.3493\n", "==================================================\n", "\n", "Epoch 56/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 1281.7625\n", "\n", "Training Loss: 650.1836\n", "Validation Loss: 345.3493\n", "==================================================\n", "\n", "Epoch 56/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 996.8378\n", " Batch 2/6, Loss: 996.8378\n", " Batch 4/6, Loss: 844.3442\n", " Batch 4/6, Loss: 844.3442\n", " Batch 6/6, Loss: 2091.5588\n", "\n", "Training Loss: 1019.7203\n", "Validation Loss: 235.4854\n", "==================================================\n", "\n", "Epoch 57/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 2091.5588\n", "\n", "Training Loss: 1019.7203\n", "Validation Loss: 235.4854\n", "==================================================\n", "\n", "Epoch 57/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1286.6467\n", " Batch 2/6, Loss: 1286.6467\n", " Batch 4/6, Loss: 417.1038\n", " Batch 4/6, Loss: 417.1038\n", " Batch 6/6, Loss: 175.3365\n", "\n", "Training Loss: 827.2888\n", "Validation Loss: 289.4587\n", "==================================================\n", "\n", "Epoch 58/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 175.3365\n", "\n", "Training Loss: 827.2888\n", "Validation Loss: 289.4587\n", "==================================================\n", "\n", "Epoch 58/100\n", 
"----------------------------------------\n", " Batch 2/6, Loss: 919.0460\n", " Batch 2/6, Loss: 919.0460\n", " Batch 4/6, Loss: 965.7724\n", " Batch 4/6, Loss: 965.7724\n", " Batch 6/6, Loss: 316.2888\n", "\n", "Training Loss: 1201.1364\n", "Validation Loss: 295.0055\n", "==================================================\n", "\n", "Epoch 59/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 316.2888\n", "\n", "Training Loss: 1201.1364\n", "Validation Loss: 295.0055\n", "==================================================\n", "\n", "Epoch 59/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 861.2559\n", " Batch 2/6, Loss: 861.2559\n", " Batch 4/6, Loss: 1265.3905\n", " Batch 4/6, Loss: 1265.3905\n", " Batch 6/6, Loss: 1755.4694\n", "\n", "Training Loss: 1177.1436\n", "Validation Loss: 429.0676\n", "==================================================\n", "\n", "Epoch 60/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 1755.4694\n", "\n", "Training Loss: 1177.1436\n", "Validation Loss: 429.0676\n", "==================================================\n", "\n", "Epoch 60/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 2777.2812\n", " Batch 2/6, Loss: 2777.2812\n", " Batch 4/6, Loss: 632.5978\n", " Batch 4/6, Loss: 632.5978\n", " Batch 6/6, Loss: 806.3088\n", "\n", "Training Loss: 1529.9540\n", "Validation Loss: 309.5401\n", "==================================================\n", "\n", "Epoch 61/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 806.3088\n", "\n", "Training Loss: 1529.9540\n", "Validation Loss: 309.5401\n", "==================================================\n", "\n", "Epoch 61/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1041.1650\n", " Batch 2/6, Loss: 1041.1650\n", " Batch 4/6, Loss: 1586.5848\n", " Batch 4/6, Loss: 1586.5848\n", " Batch 6/6, Loss: 807.8189\n", "\n", "Training Loss: 852.1445\n", "Validation 
Loss: 253.4920\n", "==================================================\n", "\n", "Epoch 62/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 807.8189\n", "\n", "Training Loss: 852.1445\n", "Validation Loss: 253.4920\n", "==================================================\n", "\n", "Epoch 62/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1260.4893\n", " Batch 2/6, Loss: 1260.4893\n", " Batch 4/6, Loss: 396.4262\n", " Batch 4/6, Loss: 396.4262\n", " Batch 6/6, Loss: 195.5327\n", "\n", "Training Loss: 517.2712\n", "Validation Loss: 289.6780\n", "==================================================\n", "\n", "Epoch 63/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 195.5327\n", "\n", "Training Loss: 517.2712\n", "Validation Loss: 289.6780\n", "==================================================\n", "\n", "Epoch 63/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 491.4989\n", " Batch 2/6, Loss: 491.4989\n", " Batch 4/6, Loss: 2093.6799\n", " Batch 4/6, Loss: 2093.6799\n", " Batch 6/6, Loss: 1543.6222\n", "\n", "Training Loss: 974.9110\n", "Validation Loss: 477.5977\n", "==================================================\n", "\n", "Epoch 64/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 1543.6222\n", "\n", "Training Loss: 974.9110\n", "Validation Loss: 477.5977\n", "==================================================\n", "\n", "Epoch 64/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 741.2250\n", " Batch 2/6, Loss: 741.2250\n", " Batch 4/6, Loss: 1103.4115\n", " Batch 4/6, Loss: 1103.4115\n", " Batch 6/6, Loss: 850.4505\n", "\n", "Training Loss: 1072.9318\n", "Validation Loss: 236.4798\n", "==================================================\n", "\n", "Epoch 65/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 850.4505\n", "\n", "Training Loss: 1072.9318\n", "Validation Loss: 236.4798\n", 
"==================================================\n", "\n", "Epoch 65/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1601.4633\n", " Batch 2/6, Loss: 1601.4633\n", " Batch 4/6, Loss: 1000.5103\n", " Batch 4/6, Loss: 1000.5103\n", " Batch 6/6, Loss: 1174.9192\n", "\n", "Training Loss: 1053.1631\n", "Validation Loss: 216.6086\n", "==================================================\n", "\n", "Epoch 66/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 1174.9192\n", "\n", "Training Loss: 1053.1631\n", "Validation Loss: 216.6086\n", "==================================================\n", "\n", "Epoch 66/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1349.9268\n", " Batch 2/6, Loss: 1349.9268\n", " Batch 4/6, Loss: 858.4254\n", " Batch 4/6, Loss: 858.4254\n", " Batch 6/6, Loss: 846.1974\n", "\n", "Training Loss: 891.6657\n", "Validation Loss: 506.2389\n", "==================================================\n", "\n", "Epoch 67/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 846.1974\n", "\n", "Training Loss: 891.6657\n", "Validation Loss: 506.2389\n", "==================================================\n", "\n", "Epoch 67/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1426.5281\n", " Batch 2/6, Loss: 1426.5281\n", " Batch 4/6, Loss: 553.9657\n", " Batch 4/6, Loss: 553.9657\n", " Batch 6/6, Loss: 1804.3223\n", "\n", "Training Loss: 1410.6379\n", "Validation Loss: 584.2383\n", "==================================================\n", "\n", "Epoch 68/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 1804.3223\n", "\n", "Training Loss: 1410.6379\n", "Validation Loss: 584.2383\n", "==================================================\n", "\n", "Epoch 68/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 932.2189\n", " Batch 2/6, Loss: 932.2189\n", " Batch 4/6, Loss: 744.6321\n", " Batch 4/6, Loss: 744.6321\n", " 
Batch 6/6, Loss: 562.3445\n", "\n", "Training Loss: 1138.0649\n", "Validation Loss: 459.6956\n", "==================================================\n", "\n", "Epoch 69/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 562.3445\n", "\n", "Training Loss: 1138.0649\n", "Validation Loss: 459.6956\n", "==================================================\n", "\n", "Epoch 69/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1892.1584\n", " Batch 2/6, Loss: 1892.1584\n", " Batch 4/6, Loss: 1386.6692\n", " Batch 4/6, Loss: 1386.6692\n", " Batch 6/6, Loss: 694.9966\n", "\n", "Training Loss: 1194.2322\n", "Validation Loss: 394.8461\n", "==================================================\n", "\n", "Epoch 70/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 694.9966\n", "\n", "Training Loss: 1194.2322\n", "Validation Loss: 394.8461\n", "==================================================\n", "\n", "Epoch 70/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1495.4666\n", " Batch 2/6, Loss: 1495.4666\n", " Batch 4/6, Loss: 2141.7749\n", " Batch 4/6, Loss: 2141.7749\n", " Batch 6/6, Loss: 1136.0317\n", "\n", "Training Loss: 1154.5981\n", "Validation Loss: 813.4258\n", "==================================================\n", "\n", "Epoch 71/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 1136.0317\n", "\n", "Training Loss: 1154.5981\n", "Validation Loss: 813.4258\n", "==================================================\n", "\n", "Epoch 71/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1148.6038\n", " Batch 2/6, Loss: 1148.6038\n", " Batch 4/6, Loss: 1021.5333\n", " Batch 4/6, Loss: 1021.5333\n", " Batch 6/6, Loss: 1002.3242\n", "\n", "Training Loss: 980.4191\n", "Validation Loss: 746.3327\n", "==================================================\n", "\n", "Epoch 72/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 
1002.3242\n", "\n", "Training Loss: 980.4191\n", "Validation Loss: 746.3327\n", "==================================================\n", "\n", "Epoch 72/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1087.6707\n", " Batch 2/6, Loss: 1087.6707\n", " Batch 4/6, Loss: 305.9226\n", " Batch 4/6, Loss: 305.9226\n", " Batch 6/6, Loss: 1795.0642\n", "\n", "Training Loss: 1274.7856\n", "Validation Loss: 351.7238\n", "==================================================\n", "\n", "Epoch 73/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 1795.0642\n", "\n", "Training Loss: 1274.7856\n", "Validation Loss: 351.7238\n", "==================================================\n", "\n", "Epoch 73/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 2404.6431\n", " Batch 2/6, Loss: 2404.6431\n", " Batch 4/6, Loss: 1157.0549\n", " Batch 4/6, Loss: 1157.0549\n", " Batch 6/6, Loss: 477.4741\n", "\n", "Training Loss: 1125.0124\n", "Validation Loss: 385.6382\n", "==================================================\n", "\n", "Epoch 74/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 477.4741\n", "\n", "Training Loss: 1125.0124\n", "Validation Loss: 385.6382\n", "==================================================\n", "\n", "Epoch 74/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1105.8162\n", " Batch 2/6, Loss: 1105.8162\n", " Batch 4/6, Loss: 390.5640\n", " Batch 4/6, Loss: 390.5640\n", " Batch 6/6, Loss: 1793.9602\n", "\n", "Training Loss: 1110.7176\n", "Validation Loss: 302.5034\n", "==================================================\n", "\n", "Epoch 75/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 1793.9602\n", "\n", "Training Loss: 1110.7176\n", "Validation Loss: 302.5034\n", "==================================================\n", "\n", "Epoch 75/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1137.3442\n", " Batch 2/6, 
Loss: 1137.3442\n", " Batch 4/6, Loss: 756.0102\n", " Batch 4/6, Loss: 756.0102\n", " Batch 6/6, Loss: 664.8764\n", "\n", "Training Loss: 1113.0810\n", "Validation Loss: 446.4949\n", "==================================================\n", "\n", "Epoch 76/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 664.8764\n", "\n", "Training Loss: 1113.0810\n", "Validation Loss: 446.4949\n", "==================================================\n", "\n", "Epoch 76/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 2160.4382\n", " Batch 2/6, Loss: 2160.4382\n", " Batch 4/6, Loss: 1295.9885\n", " Batch 4/6, Loss: 1295.9885\n", " Batch 6/6, Loss: 300.2890\n", "\n", "Training Loss: 1310.1552\n", "Validation Loss: 1005.4635\n", "==================================================\n", "\n", "Epoch 77/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 300.2890\n", "\n", "Training Loss: 1310.1552\n", "Validation Loss: 1005.4635\n", "==================================================\n", "\n", "Epoch 77/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 2651.1912\n", " Batch 2/6, Loss: 2651.1912\n", " Batch 4/6, Loss: 1506.8727\n", " Batch 4/6, Loss: 1506.8727\n", " Batch 6/6, Loss: 1258.9523\n", "\n", "Training Loss: 1413.2891\n", "Validation Loss: 418.9783\n", "==================================================\n", "\n", "Epoch 78/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 1258.9523\n", "\n", "Training Loss: 1413.2891\n", "Validation Loss: 418.9783\n", "==================================================\n", "\n", "Epoch 78/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1911.2028\n", " Batch 2/6, Loss: 1911.2028\n", " Batch 4/6, Loss: 192.0707\n", " Batch 4/6, Loss: 192.0707\n", " Batch 6/6, Loss: 2079.9006\n", "\n", "Training Loss: 1429.2989\n", "Validation Loss: 299.6776\n", "==================================================\n", "\n", 
"Epoch 79/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 2079.9006\n", "\n", "Training Loss: 1429.2989\n", "Validation Loss: 299.6776\n", "==================================================\n", "\n", "Epoch 79/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 469.2592\n", " Batch 2/6, Loss: 469.2592\n", " Batch 4/6, Loss: 841.7337\n", " Batch 4/6, Loss: 841.7337\n", " Batch 6/6, Loss: 417.4185\n", "\n", "Training Loss: 621.8783\n", "Validation Loss: 602.1896\n", "==================================================\n", "\n", "Epoch 80/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 417.4185\n", "\n", "Training Loss: 621.8783\n", "Validation Loss: 602.1896\n", "==================================================\n", "\n", "Epoch 80/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 774.4798\n", " Batch 2/6, Loss: 774.4798\n", " Batch 4/6, Loss: 2120.1550\n", " Batch 4/6, Loss: 2120.1550\n", " Batch 6/6, Loss: 703.3870\n", "\n", "Training Loss: 1458.7352\n", "Validation Loss: 401.0887\n", "==================================================\n", "\n", "Epoch 81/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 703.3870\n", "\n", "Training Loss: 1458.7352\n", "Validation Loss: 401.0887\n", "==================================================\n", "\n", "Epoch 81/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 700.1279\n", " Batch 2/6, Loss: 700.1279\n", " Batch 4/6, Loss: 978.2941\n", " Batch 4/6, Loss: 978.2941\n", " Batch 6/6, Loss: 93.4857\n", "\n", "Training Loss: 869.4923\n", "Validation Loss: 576.3536\n", "==================================================\n", "\n", "Epoch 82/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 93.4857\n", "\n", "Training Loss: 869.4923\n", "Validation Loss: 576.3536\n", "==================================================\n", "\n", "Epoch 82/100\n", 
"----------------------------------------\n", " Batch 2/6, Loss: 271.9695\n", " Batch 2/6, Loss: 271.9695\n", " Batch 4/6, Loss: 524.4205\n", " Batch 4/6, Loss: 524.4205\n", " Batch 6/6, Loss: 391.3031\n", "\n", "Training Loss: 715.7422\n", "Validation Loss: 385.1625\n", "==================================================\n", "\n", "Epoch 83/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 391.3031\n", "\n", "Training Loss: 715.7422\n", "Validation Loss: 385.1625\n", "==================================================\n", "\n", "Epoch 83/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 2158.5562\n", " Batch 2/6, Loss: 2158.5562\n", " Batch 4/6, Loss: 926.0807\n", " Batch 4/6, Loss: 926.0807\n", " Batch 6/6, Loss: 1073.7682\n", "\n", "Training Loss: 1355.6703\n", "Validation Loss: 1026.5629\n", "==================================================\n", "\n", "Epoch 84/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 1073.7682\n", "\n", "Training Loss: 1355.6703\n", "Validation Loss: 1026.5629\n", "==================================================\n", "\n", "Epoch 84/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 259.5579\n", " Batch 2/6, Loss: 259.5579\n", " Batch 4/6, Loss: 3026.2922\n", " Batch 4/6, Loss: 3026.2922\n", " Batch 6/6, Loss: 3677.9365\n", "\n", "Training Loss: 2666.0879\n", "Validation Loss: 381.7251\n", "==================================================\n", "\n", "Epoch 85/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 3677.9365\n", "\n", "Training Loss: 2666.0879\n", "Validation Loss: 381.7251\n", "==================================================\n", "\n", "Epoch 85/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 711.9403\n", " Batch 2/6, Loss: 711.9403\n", " Batch 4/6, Loss: 1345.9635\n", " Batch 4/6, Loss: 1345.9635\n", " Batch 6/6, Loss: 1704.4912\n", "\n", "Training Loss: 1567.6093\n", "Validation 
Loss: 2281.0577\n", "==================================================\n", "\n", "Epoch 86/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 1704.4912\n", "\n", "Training Loss: 1567.6093\n", "Validation Loss: 2281.0577\n", "==================================================\n", "\n", "Epoch 86/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 584.7172\n", " Batch 2/6, Loss: 584.7172\n", " Batch 4/6, Loss: 1563.0305\n", " Batch 4/6, Loss: 1563.0305\n", " Batch 6/6, Loss: 533.0541\n", "\n", "Training Loss: 1041.2315\n", "Validation Loss: 493.2439\n", "==================================================\n", "\n", "Epoch 87/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 533.0541\n", "\n", "Training Loss: 1041.2315\n", "Validation Loss: 493.2439\n", "==================================================\n", "\n", "Epoch 87/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 2047.6639\n", " Batch 2/6, Loss: 2047.6639\n", " Batch 4/6, Loss: 264.0167\n", " Batch 4/6, Loss: 264.0167\n", " Batch 6/6, Loss: 701.7789\n", "\n", "Training Loss: 1241.5895\n", "Validation Loss: 1105.6261\n", "==================================================\n", "\n", "Epoch 88/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 701.7789\n", "\n", "Training Loss: 1241.5895\n", "Validation Loss: 1105.6261\n", "==================================================\n", "\n", "Epoch 88/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1226.7020\n", " Batch 2/6, Loss: 1226.7020\n", " Batch 4/6, Loss: 1822.4323\n", " Batch 4/6, Loss: 1822.4323\n", " Batch 6/6, Loss: 1521.5305\n", "\n", "Training Loss: 1270.7472\n", "Validation Loss: 642.0230\n", "==================================================\n", "\n", "Epoch 89/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 1521.5305\n", "\n", "Training Loss: 1270.7472\n", "Validation Loss: 642.0230\n", 
"==================================================\n", "\n", "Epoch 89/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1037.7415\n", " Batch 2/6, Loss: 1037.7415\n", " Batch 4/6, Loss: 997.1461\n", " Batch 4/6, Loss: 997.1461\n", " Batch 6/6, Loss: 1074.6147\n", "\n", "Training Loss: 1057.4086\n", "Validation Loss: 658.3756\n", "==================================================\n", "\n", "Epoch 90/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 1074.6147\n", "\n", "Training Loss: 1057.4086\n", "Validation Loss: 658.3756\n", "==================================================\n", "\n", "Epoch 90/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1422.3121\n", " Batch 2/6, Loss: 1422.3121\n", " Batch 4/6, Loss: 889.4645\n", " Batch 4/6, Loss: 889.4645\n", " Batch 6/6, Loss: 653.3372\n", "\n", "Training Loss: 950.8944\n", "Validation Loss: 600.2287\n", "==================================================\n", "\n", "Epoch 91/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 653.3372\n", "\n", "Training Loss: 950.8944\n", "Validation Loss: 600.2287\n", "==================================================\n", "\n", "Epoch 91/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 425.2661\n", " Batch 2/6, Loss: 425.2661\n", " Batch 4/6, Loss: 1312.3296\n", " Batch 4/6, Loss: 1312.3296\n", " Batch 6/6, Loss: 310.0301\n", "\n", "Training Loss: 1320.8423\n", "Validation Loss: 411.2174\n", "==================================================\n", "\n", "Epoch 92/100\n", "----------------------------------------\n", " Batch 6/6, Loss: 310.0301\n", "\n", "Training Loss: 1320.8423\n", "Validation Loss: 411.2174\n", "==================================================\n", "\n", "Epoch 92/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1734.9235\n", " Batch 2/6, Loss: 1734.9235\n", " Batch 4/6, Loss: 585.0414\n", " Batch 4/6, Loss: 585.0414\n", " 
Batch 6/6, Loss: 262.4648\n", "\n", "Training Loss: 768.0501\n", "Validation Loss: 428.2428\n", "==================================================\n", "\n", "Epoch 93/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1084.6410\n", " Batch 4/6, Loss: 451.9109\n", " Batch 6/6, Loss: 588.4257\n", "\n", "Training Loss: 705.5978\n", "Validation Loss: 314.2621\n", "==================================================\n", "\n", "Epoch 94/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 971.0799\n", " Batch 4/6, Loss: 1301.5695\n", " Batch 6/6, Loss: 1287.7241\n", "\n", "Training Loss: 979.1417\n", "Validation Loss: 725.8828\n", "==================================================\n", "\n", "Epoch 95/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1223.4810\n", " Batch 4/6, Loss: 417.0378\n", " Batch 6/6, Loss: 1718.1934\n", "\n", "Training Loss: 915.8945\n", "Validation Loss: 1030.1551\n", "==================================================\n", "\n", "Epoch 96/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 499.9333\n", " Batch 4/6, Loss: 372.2247\n", " Batch 6/6, Loss: 1453.7803\n", "\n", "Training Loss: 841.7416\n", "Validation Loss: 401.4770\n", "==================================================\n", "\n", "Epoch 97/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 197.7269\n", " Batch 4/6, Loss: 1313.4368\n", " Batch 6/6, Loss: 1749.7418\n", "\n", "Training Loss: 1035.9815\n", "Validation Loss: 355.3339\n", "==================================================\n", "\n", "Epoch 98/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 330.8758\n", " Batch 4/6, Loss: 943.8824\n", " Batch 6/6, Loss: 971.1874\n", "\n", "Training Loss: 908.5701\n", "Validation Loss: 485.2701\n", "==================================================\n", "\n", "Epoch 99/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 2611.0654\n", " Batch 4/6, Loss: 1940.4594\n", " Batch 6/6, Loss: 1291.4149\n", "\n", "Training Loss: 1527.1883\n", "Validation Loss: 1819.1451\n", "==================================================\n", "\n", "Epoch 100/100\n", "----------------------------------------\n", " Batch 2/6, Loss: 1946.8259\n", " Batch 4/6, Loss: 2668.1829\n", " Batch 6/6, Loss: 1472.7712\n", "\n", "Training Loss: 1695.5324\n", "Validation Loss: 1051.6064\n", "==================================================\n" ] } ], "source": [ "# Create data loaders with proper error handling\n", "try:\n", " # Print dataset info for debugging\n", " print(\"Loading cleaned dataset...\")\n", " dataset = BananaDataset(\n", " csv_file='training_data/clean_dataset.csv',\n", " img_dir='training_data',\n", " transform=train_transform\n", " )\n", " \n", " # Split into train and validation sets\n", " train_size = int(0.8 * len(dataset))\n", " val_size = len(dataset) - train_size\n", " train_dataset, val_dataset = torch.utils.data.random_split(dataset, [train_size, val_size])\n", " \n", " print(f\"\\nSplit dataset into {train_size} training and {val_size} validation samples\")\n", " \n", " train_loader = torch.utils.data.DataLoader(\n", " train_dataset, \n", " batch_size=4, # Reduced batch size\n", " shuffle=True,\n", " num_workers=0 # Disabled multiprocessing for debugging\n", " )\n", " \n", " val_loader = torch.utils.data.DataLoader(\n", " val_dataset, \n", " batch_size=4, # Reduced batch size\n", " shuffle=False,\n", " num_workers=0 # Disabled 
multiprocessing for debugging\n", " )\n", " \n", " # Initialize model, loss function, and optimizer\n", " device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')\n", " model = BananaNet().to(device)\n", " criterion = nn.MSELoss()\n", " optimizer = optim.Adam(model.parameters(), lr=0.001)\n", " \n", " # Training loop\n", " num_epochs = 200 # Increased from 100 to 200 for further training\n", " best_val_loss = float('inf')\n", " \n", " print(f\"\\nTraining on {device}\")\n", " print(\"=\" * 50)\n", " \n", " for epoch in range(num_epochs):\n", " print(f\"\\nEpoch {epoch+1}/{num_epochs}\")\n", " print(\"-\" * 40)\n", " \n", " # Training phase\n", " model.train()\n", " train_loss = 0.0\n", " for batch_idx, (images, labels) in enumerate(train_loader):\n", " images, labels = images.to(device), labels.to(device)\n", " \n", " optimizer.zero_grad()\n", " outputs = model(images)\n", " loss = criterion(outputs, labels)\n", " loss.backward()\n", " optimizer.step()\n", " \n", " train_loss += loss.item()\n", " \n", " if (batch_idx + 1) % 2 == 0:\n", " print(f\" Batch {batch_idx + 1}/{len(train_loader)}, Loss: {loss.item():.4f}\")\n", " \n", " train_loss = train_loss / len(train_loader)\n", " \n", " # Validation phase\n", " model.eval()\n", " val_loss = 0.0\n", " with torch.no_grad():\n", " for images, labels in val_loader:\n", " images, labels = images.to(device), labels.to(device)\n", " outputs = model(images)\n", " loss = criterion(outputs, labels)\n", " val_loss += loss.item()\n", " \n", " val_loss = val_loss / len(val_loader)\n", " \n", " print(f\"\\nTraining Loss: {train_loss:.4f}\")\n", " print(f\"Validation Loss: {val_loss:.4f}\")\n", " \n", " # Save best model\n", " if val_loss < best_val_loss:\n", " best_val_loss = val_loss\n", " torch.save(model.state_dict(), 'best_model.pth')\n", " print(\"Saved new best model!\")\n", " \n", " print(\"=\" * 50)\n", " \n", "except Exception as e:\n", " print(f\"Error during training: {str(e)}\")\n", " import traceback\n", 
" traceback.print_exc()" ] }, { "cell_type": "code", "execution_count": null, "id": "1b488117", "metadata": {}, "outputs": [], "source": [ "# Function to test the model on a single image\n", "def test_prediction(model, image_path, device):\n", " # Load and preprocess image\n", " transform = transforms.Compose([\n", " transforms.Resize((224, 224)),\n", " transforms.ToTensor(),\n", " transforms.Normalize(mean=[0.485, 0.456, 0.406],\n", " std=[0.229, 0.224, 0.225])\n", " ])\n", " \n", " image = Image.open(image_path).convert('RGB')\n", " image_tensor = transform(image).unsqueeze(0).to(device)\n", " \n", " # Get predictions\n", " model.eval()\n", " with torch.no_grad():\n", " predictions = model(image_tensor)\n", " \n", " # Denormalize predictions\n", " seed_count = int(round(predictions[0][0].item() * 500)) # Denormalize seeds\n", " curvature = round(predictions[0][1].item() * 360, 1) # Denormalize degrees\n", " \n", " return {\n", " 'seeds': max(0, seed_count),\n", " 'curvature': max(0, min(360, curvature))\n", " }\n", "\n", "# Test the model on a sample image\n", "# Note: Replace 'sample_image.jpg' with an actual image path\n", "print(\"Model training complete! You can now use test_prediction() to make predictions on new images.\")" ] }, { "cell_type": "markdown", "id": "b6b3dcf8", "metadata": {}, "source": [ "# Data Preparation\n", "\n", "The model will use the following input features to make predictions:\n", "1. Length (cm)\n", "2. Width (cm)\n", "3. Weight (g)\n", "4. Ripeness level (1-5)\n", "5. Color (encoded)\n", "\n", "The model will predict:\n", "1. Number of seeds\n", "2. 
Curvature (degrees)" ] }, { "cell_type": "code", "execution_count": 6, "id": "5c03fa5e", "metadata": {}, "outputs": [], "source": [ "# Create sample data structure (to be replaced with real data later)\n", "def create_sample_data(n_samples=100):\n", " np.random.seed(42)\n", " \n", " # Generate synthetic features\n", " length = np.random.normal(15, 2, n_samples) # Average banana length 15cm\n", " width = np.random.normal(3, 0.5, n_samples) # Average banana width 3cm\n", " weight = np.random.normal(120, 20, n_samples) # Average banana weight 120g\n", " ripeness = np.random.randint(1, 6, n_samples) # Ripeness level 1-5\n", " color = np.random.randint(1, 4, n_samples) # Color encoded as 1=green, 2=yellow, 3=brown\n", " \n", " # Create feature matrix\n", " X = np.column_stack([length, width, weight, ripeness, color])\n", " \n", " # Generate targets\n", " # 1. Number of seeds\n", " seeds = (0.5 * length + 0.3 * width + 0.2 * weight/100 + 0.1 * ripeness + \n", " 0.1 * color).astype(int)\n", " seeds = np.clip(seeds, 0, 20) # Limit number of seeds between 0 and 20\n", " \n", " # 2. Curvature (degrees) - influenced by length and ripeness\n", " curvature = (45 + 0.5 * (length - 15) + 2 * (ripeness - 3) + \n", " np.random.normal(0, 5, n_samples))\n", " curvature = np.clip(curvature, 20, 70) # Limit curvature between 20 and 70 degrees\n", " \n", " # Create DataFrame for features\n", " columns = ['length', 'width', 'weight', 'ripeness', 'color']\n", " df = pd.DataFrame(X, columns=columns)\n", " \n", " # Stack targets into a single array\n", " y = np.column_stack([seeds, curvature])\n", " \n", " return df, y\n", "\n", "# Create sample dataset\n", "X_data, y_data = create_sample_data()" ] }, { "cell_type": "code", "execution_count": 7, "id": "14104b31", "metadata": {}, "outputs": [ { "data": { "text/html": [ "
RandomForestRegressor(max_depth=10, random_state=42)\n", "\n", "Parameters:\n", "    n_estimators: 100\n", "    criterion: 'squared_error'\n", "    max_depth: 10\n", "    min_samples_split: 2\n", "    min_samples_leaf: 1\n", "    min_weight_fraction_leaf: 0.0\n", "    max_features: 1.0\n", "    max_leaf_nodes: None\n", "    min_impurity_decrease: 0.0\n", "    bootstrap: True\n", "    oob_score: False\n", "    n_jobs: None\n", "    random_state: 42\n", "    verbose: 0\n", "    warm_start: False\n", "    ccp_alpha: 0.0\n", "    max_samples: None\n", "    monotonic_cst: None\n", "