ResNet50 Fine-Tuned – CIFAR-10

πŸ“Œ Overview

ResNet50 Fine-Tuned for CIFAR-10 is a high-performance image classification model built by adapting an ImageNet-pretrained ResNet50 to the CIFAR-10 dataset.

Instead of training from scratch, this model leverages pretrained visual representations and selectively fine-tunes higher layers, achieving strong accuracy with faster convergence and improved generalization.

This model represents a production-oriented baseline for small-image classification tasks.


πŸ— Architecture Summary

Input (224Γ—224)
└─ Pretrained ResNet50 Backbone
   β”œβ”€ Conv β†’ BN β†’ ReLU
   β”œβ”€ Residual Blocks (frozen)
   β”œβ”€ Residual Blocks (fine-tuned: layer4)
   └─ Global Average Pool
└─ Custom FC Head
   β”œβ”€ Linear (2048 β†’ 512)
   β”œβ”€ ReLU
   β”œβ”€ Dropout (0.5)
   └─ Linear (512 β†’ 10)

πŸ“Š Performance

Metric               Value
Validation Accuracy  92.46%
Validation Loss      0.257
Training Epochs      40
Optimizer            AdamW
Learning Rate        1e-4
Weight Decay         1e-4

By leveraging pretrained ImageNet features, this model outperforms the scratch-trained baselines in this project (see the comparison below).


πŸ”¬ Training Details

  • Dataset: CIFAR-10
  • Input resolution: 224Γ—224
  • Normalization: ImageNet statistics
  • Loss: CrossEntropyLoss
  • Scheduler: Cosine Annealing
  • Batch size: 64
  • Hardware: NVIDIA GTX 1650
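The hyperparameters above translate into a training setup along these lines. This is a hedged sketch, not the project's actual training script; the `model` here is a placeholder module standing in for the fine-tuned ResNet50.

```python
import torch
from torch.optim import AdamW
from torch.optim.lr_scheduler import CosineAnnealingLR

# Placeholder module for illustration only; in the real run this is the
# fine-tuned ResNet50 with frozen backbone + trainable layer4 and FC head.
model = torch.nn.Linear(2048, 10)

criterion = torch.nn.CrossEntropyLoss()

# Only parameters with requires_grad=True (layer4 + FC head) are optimized,
# using the listed hyperparameters: lr=1e-4, weight_decay=1e-4.
optimizer = AdamW(
    (p for p in model.parameters() if p.requires_grad),
    lr=1e-4,
    weight_decay=1e-4,
)

# Cosine annealing over the full 40-epoch run.
scheduler = CosineAnnealingLR(optimizer, T_max=40)

for epoch in range(40):
    # ... one pass over the CIFAR-10 training loader (batch size 64) here ...
    scheduler.step()
```

With `T_max=40`, the learning rate decays smoothly from 1e-4 toward zero by the final epoch.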

πŸš€ Usage

Load model

import torch
from model import ResNet50FineTune

model = ResNet50FineTune(num_classes=10)

ckpt = torch.load("resnet50_finetune_cifar.pt", map_location="cpu")
model.load_state_dict(ckpt["state_dict"])

model.eval()

Input preprocessing

from torchvision import transforms

transform = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(
        mean=(0.485, 0.456, 0.406),
        std=(0.229, 0.224, 0.225)
    )
])

πŸ§ͺ Intended Use

  • Production-ready CIFAR-10 classification
  • Transfer learning benchmarks
  • Fine-tuning experiments
  • Educational demonstrations of pretrained CNN adaptation

⚠ Limitations

  • Input images must be resized to 224Γ—224
  • Depends on ImageNet-pretrained weights
  • Less suitable for architecture research compared to scratch models

πŸ” Comparison with Scratch Model

Model                Training           Val Acc
PreNormResNet v4     From scratch       89.29%
ResNet50 Fine-Tuned  Transfer learning  92.46%

Pretraining shifts the learning regime and provides a strong inductive bias, which is especially effective for small datasets like CIFAR-10.


πŸ‘€ Author

AnjanSB

Built with PyTorch, MLflow, DVC, and Dagshub

πŸ“Š Experiment Repo: https://dagshub.com/AnjanSB/cifar-image-classification-modeling-dlops

πŸš€ Live Demo (Custom Scratch Model): https://huggingface.co/spaces/AnjanSB/cifar10-prenorm-resnet-demo


🏁 Final Note

This model represents a real-world deployment strategy, while the custom PreNormResNet v4 highlights architecture engineering and research depth. Together, they form a complete and well-rounded ML portfolio.
