ResNet-18 Perforated

Model Description

This is a perforated ResNet-18: a ResNet-18 enhanced with dendritic optimization, a novel technique that increases accuracy with significantly improved parameter efficiency. The model was pretrained on ImageNet and can be used to achieve ResNet-34-level performance on Flowers-102 while using a fraction of the parameters, making it ideal for transfer learning applications.

What is Dendritic Optimization?

Dendritic optimization (referred to as "perforation") enhances neural networks by adding specialized dendrite nodes to individual neurons. Each dendrite node receives the same inputs as its parent neuron but provides a dedicated output channel exclusively to that neuron. This allows neurons to make more sophisticated decisions about the features they encode, similar to how biological dendrites help neurons process information.

Think of it as giving each neuron its own set of specialized advisors that help it better understand the features it's responsible for detecting.
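
The idea can be sketched with a toy, pure-Python neuron. This is a hypothetical illustration of the concept only, not the PerforatedAI implementation or API; the function name, `tanh` activation, and gating scheme are invented for the example.

```python
import math

def toy_perforated_neuron(x, neuron_w, dendrite_ws, dendrite_gates):
    """Hypothetical sketch: one neuron with dedicated dendrite nodes.

    Each dendrite node sees the same inputs ``x`` as the parent neuron,
    but its output feeds only that neuron (a private channel).
    """
    # Each dendrite computes its own weighted view of the shared inputs.
    dendrite_outs = [
        math.tanh(sum(wi * xi for wi, xi in zip(w, x)))
        for w in dendrite_ws
    ]
    # The parent neuron combines its own weighted input with the
    # gated outputs of its private dendrite channels.
    pre_act = sum(wi * xi for wi, xi in zip(neuron_w, x))
    pre_act += sum(g * d for g, d in zip(dendrite_gates, dendrite_outs))
    return math.tanh(pre_act)

# With all dendrite gates at zero the neuron behaves like a plain neuron;
# non-zero gates let the dendrite channels refine its decision.
plain = toy_perforated_neuron([1.0, 2.0], [0.5, -0.25], [[0.1, 0.2]], [0.0])
```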

Model Details

  • Architecture: ResNet-18 with perforated pre-FC layer
  • Parameters: 12.3M (compared to 11.7M for standard ResNet-18, 21.8M for ResNet-34)
  • Dendrites: 3 dendrite nodes per neuron in the pre-FC layer
  • Pre-training: ImageNet-1k
  • Framework: PyTorch
  • License: Apache 2.0

Architecture Flow

Main ResNet-18 backbone
Pre-FC Layer (Perforated FC layer with 3 dendrites)
FC Layer (replaceable for transfer learning)

The uploaded FC layer has randomized weights; this model is intended only for transfer learning, where the FC layer should be replaced.

Performance

Parameter Efficiency on ImageNet

The key advantage of dendritic optimization is a dramatic improvement in parameter efficiency. Consider the trade-off of moving from ResNet-18 to ResNet-34:

| Model | Parameters | ImageNet Accuracy | Percentage Gain per M Params |
|---|---|---|---|
| ResNet-18 | 11.7M | 69.76% | – (baseline) |
| ResNet-34 | 21.8M | 73.30% | 0.35 |
| ResNet-18 + 1 Dendrite | 12.2M | 71.09% | 2.54 |
| ResNet-18 + 2 Dendrites | 12.7M | 71.60% | 1.75 |
| ResNet-18 + 3 Dendrites | 13.3M | 71.95% | 1.38 |

Key Insight: Adding dendrites to ResNet-18 provides 4-7x better accuracy improvement per additional parameter compared to upgrading to ResNet-34. These numbers were achieved with a variant that has no pre-FC layer and instead perforates the output FC layer itself. That model can also be uploaded on request.
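
The "4-7x" range follows directly from the last column of the table above; a few lines of arithmetic reproduce it:

```python
# Per-parameter accuracy gains, taken from the table above (% per M params).
resnet34_gain_per_m = 0.35            # upgrading ResNet-18 -> ResNet-34
dendrite_gains_per_m = [2.54, 1.75, 1.38]  # 1, 2, 3 dendrites

# Ratio of each dendrite variant's efficiency to the ResNet-34 upgrade:
ratios = [g / resnet34_gain_per_m for g in dendrite_gains_per_m]
# Roughly 7.3x, 5.0x, and 3.9x better accuracy per added parameter,
# i.e. the "4-7x" range quoted above.
```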

Transfer Learning Performance (Flowers-102)

This model excels at transfer learning tasks:

| Model | Parameters | Flowers-102 Accuracy | Oxford Pets Accuracy | Food-101 Accuracy |
|---|---|---|---|---|
| ResNet-18 | 11.7M | 89.81% | 90.8% | 81.7% |
| ResNet-34 | 21.8M | 90.691% | 92.6% | 83.9% |
| ResNet-18 Perforated (this model) | 12.3M | 90.463% | 91.4% | 82.1% |

Result: On Flowers-102, Oxford Pets, and Food-101, this model closes 74%, 33%, and 18% of the ResNet-18 to ResNet-34 accuracy gap, respectively, while covering only 6% of the parameter gap.
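
The "gap closed" figures can be reproduced with a few lines of arithmetic (accuracies and parameter counts taken from the table above):

```python
# Top-1 accuracy in % per dataset: (ResNet-18, ResNet-34, this model).
accuracies = {
    "flowers": (89.81, 90.691, 90.463),
    "pets":    (90.8,  92.6,   91.4),
    "food":    (81.7,  83.9,   82.1),
}

# Fraction of the ResNet-18 -> ResNet-34 accuracy gap that the
# perforated model closes on each dataset.
gap_closed = {
    name: (perf - r18) / (r34 - r18)
    for name, (r18, r34, perf) in accuracies.items()
}

# Fraction of the ResNet-18 -> ResNet-34 parameter gap it adds (~6%).
param_gap = (12.3 - 11.7) / (21.8 - 11.7)
```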

Full results of all experiments performed can be found here.

Latency

The original model was built from the torchvision ResNet rather than microsoft/resnet, so latency comparisons start from the torchvision baseline to ensure fair results. The test processed one image at a time on an AMD Ryzen Threadripper PRO 9975WX CPU over the full Flowers-102 dataset.

| Model | Time Per Image | Throughput |
|---|---|---|
| ResNet-18 | 4.04ms | 247.46 FPS |
| ResNet-34 | 7.48ms | 133.74 FPS |
| ResNet-18 Perforated | 4.37ms | 228.63 FPS |

As expected, latency scales roughly in proportion to parameter count across the models.
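
A quick check of that proportionality, using the measured latencies and parameter counts from the tables above:

```python
# Measured per-image latency (ms) and parameter counts (millions).
latency_ms = {"resnet18": 4.04, "resnet34": 7.48, "perforated": 4.37}
params_m   = {"resnet18": 11.7, "resnet34": 21.8, "perforated": 12.3}

# Ratios relative to the ResNet-18 baseline: the slowdown tracks
# the growth in parameter count fairly closely.
latency_ratio_34  = latency_ms["resnet34"] / latency_ms["resnet18"]    # ~1.85x
param_ratio_34    = params_m["resnet34"] / params_m["resnet18"]        # ~1.86x
latency_ratio_pai = latency_ms["perforated"] / latency_ms["resnet18"]  # ~1.08x
param_ratio_pai   = params_m["perforated"] / params_m["resnet18"]      # ~1.05x
```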

Usage

Installation

First, install the PerforatedAI library:

pip install perforatedai

Loading the Model

import torch
import torchvision
from perforatedai import utils_perforatedai as UPA
from perforatedai import library_perforatedai as LPA

# Create base model architecture
base_model = torchvision.models.get_model('resnet18', weights=None, num_classes=1000)

# Convert to perforated architecture
model = LPA.ResNetPAIPreFC(base_model)

# Load pretrained weights from HuggingFace
model = UPA.from_hf_pretrained(model, 'perforated-ai/resnet-18-perforated')

Using for Transfer Learning

For transfer learning on your own dataset, replace the final FC layer:

import torch.nn as nn

# Assuming you have num_classes for your task
num_classes = 102  # Example: Flowers-102

# Replace the final FC layer
model.fc = nn.Linear(model.fc.in_features, num_classes)

Preprocessing

Use the same standard preprocessing you would use with any ResNet-18, such as the following for ImageNet:

from torchvision import transforms

transform = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], 
                       std=[0.229, 0.224, 0.225])
])

Intended Use

Primary Use Cases

  • Transfer learning for image classification tasks
  • Resource-constrained environments where parameter efficiency is critical
  • Applications requiring ResNet-34 performance with ResNet-18 computational costs

Recommendations

This model is particularly effective when you need strong feature representations without the parameter overhead of larger models. The perforated pre-FC layer has learned rich representations from ImageNet that transfer well to downstream tasks.

Training Details

Training Data

  • Dataset: ImageNet-1k (1.28M training images, 1000 classes)
  • Perforation Target: Pre-FC layer (for transfer learning optimization)

Training Procedure

Detailed training code and procedures are available in our GitHub repository:

  • Main Repository: link
  • Training Scripts: link

Evaluation

Evaluation Dataset

The model was evaluated on the Flowers-102, Oxford Pets, and Food-101 datasets for transfer learning performance.

Evaluation Code: link

Citation

If you use this model in your research, please cite:

@article{perforatedai2025,
  title={Perforated Backpropagation: A Neuroscience Inspired Extension to Artificial Neural Networks},
  author={Brenner, Rorry and Itti, Laurent},
  journal={arXiv preprint arXiv:2501.18018},
  year={2025},
  url={https://arxiv.org/abs/2501.18018}
}

Model Card Authors

PerforatedAI Team

Model Card Contact

For questions or issues, please open an issue on our GitHub repository.
