InternImage: Optimized for Qualcomm Devices

InternImage employs DCNv3 as its core operator, equipping the model with the dynamic and effective receptive fields required for downstream tasks such as object detection and segmentation, while enabling adaptive spatial aggregation.
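To make the idea concrete, here is a minimal NumPy sketch of the dynamic sampling mechanism behind deformable convolution: each output location aggregates a few bilinearly interpolated samples taken at learned fractional offsets, weighted by learned modulation scalars. This is an illustration of the concept only, not the actual DCNv3 kernel (which is grouped, vectorized, and GPU-optimized); all function names here are hypothetical.

```python
import numpy as np

def bilinear_sample(feat, y, x):
    """Bilinearly sample a 2D feature map (H, W) at fractional coords (y, x).
    Out-of-bounds neighbors contribute zero (zero padding)."""
    H, W = feat.shape
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    val = 0.0
    for dy in (0, 1):
        for dx in (0, 1):
            yy, xx = y0 + dy, x0 + dx
            if 0 <= yy < H and 0 <= xx < W:
                # Bilinear weight: product of 1D distances to the neighbor.
                wgt = (1.0 - abs(y - yy)) * (1.0 - abs(x - xx))
                val += wgt * feat[yy, xx]
    return val

def deformable_aggregate(feat, py, px, offsets, modulation):
    """Aggregate K dynamically placed samples around location (py, px).

    offsets:    (K, 2) learned fractional (dy, dx) displacements
    modulation: (K,)   learned per-sample weights
    In DCNv3 both are predicted from the input, which is what gives the
    operator its adaptive receptive field.
    """
    return sum(m * bilinear_sample(feat, py + dy, px + dx)
               for (dy, dx), m in zip(offsets, modulation))
```

With zero offsets and unit modulation this reduces to plain sampling at the center; non-zero offsets let the operator "reach" wherever the input demands.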

This repository contains pre-exported model files optimized for Qualcomm® devices. You can also use the Qualcomm® AI Hub Models library to export the model with custom configurations. More details on model performance across various devices can be found here.

Qualcomm AI Hub Models uses Qualcomm AI Hub Workbench to compile, profile, and evaluate this model. Sign up to run these models on a hosted Qualcomm® device.

Getting Started

There are two ways to deploy this model on your device:

Option 1: Download Pre-Exported Models

Below are pre-exported model assets ready for deployment.

| Runtime | Precision | Chipset | SDK Versions | Download |
|---|---|---|---|---|
| PRECOMPILED_QNN_ONNX | float | Snapdragon® X Elite | QAIRT 2.37, ONNX Runtime 1.23.0 | Download |
| PRECOMPILED_QNN_ONNX | float | Snapdragon® 8 Gen 3 Mobile | QAIRT 2.37, ONNX Runtime 1.23.0 | Download |
| PRECOMPILED_QNN_ONNX | float | Qualcomm® QCS8550 (Proxy) | QAIRT 2.37, ONNX Runtime 1.23.0 | Download |
| PRECOMPILED_QNN_ONNX | float | Snapdragon® 8 Elite For Galaxy Mobile | QAIRT 2.37, ONNX Runtime 1.23.0 | Download |
| PRECOMPILED_QNN_ONNX | float | Snapdragon® 8 Elite Gen 5 Mobile | QAIRT 2.37, ONNX Runtime 1.23.0 | Download |
| PRECOMPILED_QNN_ONNX | float | Qualcomm® QCS9075 | QAIRT 2.37, ONNX Runtime 1.23.0 | Download |
| PRECOMPILED_QNN_ONNX | w8a8 | Snapdragon® X Elite | QAIRT 2.37, ONNX Runtime 1.23.0 | Download |
| PRECOMPILED_QNN_ONNX | w8a8 | Snapdragon® 8 Gen 3 Mobile | QAIRT 2.37, ONNX Runtime 1.23.0 | Download |
| PRECOMPILED_QNN_ONNX | w8a8 | Qualcomm® QCS6490 | QAIRT 2.37, ONNX Runtime 1.23.0 | Download |
| PRECOMPILED_QNN_ONNX | w8a8 | Qualcomm® QCS8550 (Proxy) | QAIRT 2.37, ONNX Runtime 1.23.0 | Download |
| PRECOMPILED_QNN_ONNX | w8a8 | Snapdragon® 8 Elite For Galaxy Mobile | QAIRT 2.37, ONNX Runtime 1.23.0 | Download |
| PRECOMPILED_QNN_ONNX | w8a8 | Snapdragon® 7 Gen 4 Mobile | QAIRT 2.37, ONNX Runtime 1.23.0 | Download |
| PRECOMPILED_QNN_ONNX | w8a8 | Snapdragon® 8 Elite Gen 5 Mobile | QAIRT 2.37, ONNX Runtime 1.23.0 | Download |
| PRECOMPILED_QNN_ONNX | w8a8 | Qualcomm® QCM6690 | QAIRT 2.37, ONNX Runtime 1.23.0 | Download |
| PRECOMPILED_QNN_ONNX | w8a8 | Qualcomm® QCS9075 | QAIRT 2.37, ONNX Runtime 1.23.0 | Download |
| QNN_CONTEXT_BINARY | float | Snapdragon® X Elite | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | float | Snapdragon® 8 Gen 3 Mobile | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | float | Qualcomm® QCS8275 (Proxy) | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | float | Qualcomm® QCS8550 (Proxy) | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | float | Qualcomm® SA8775P | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | float | Snapdragon® 8 Elite For Galaxy Mobile | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | float | Snapdragon® 8 Elite Gen 5 Mobile | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | float | Qualcomm® SA7255P | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | float | Qualcomm® SA8295P | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | float | Qualcomm® QCS9075 | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | float | Qualcomm® QCS8450 (Proxy) | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | w8a8 | Snapdragon® X Elite | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | w8a8 | Snapdragon® 8 Gen 3 Mobile | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | w8a8 | Qualcomm® QCS6490 | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | w8a8 | Qualcomm® QCS8275 (Proxy) | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | w8a8 | Qualcomm® QCS8550 (Proxy) | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | w8a8 | Qualcomm® SA8775P | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | w8a8 | Snapdragon® 8 Elite For Galaxy Mobile | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | w8a8 | Snapdragon® 7 Gen 4 Mobile | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | w8a8 | Snapdragon® 8 Elite Gen 5 Mobile | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | w8a8 | Qualcomm® SA7255P | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | w8a8 | Qualcomm® SA8295P | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | w8a8 | Qualcomm® QCM6690 | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | w8a8 | Qualcomm® QCS9075 | QAIRT 2.42 | Download |
| QNN_CONTEXT_BINARY | w8a8 | Qualcomm® QCS8450 (Proxy) | QAIRT 2.42 | Download |

For more device-specific assets and performance metrics, visit InternImage on Qualcomm® AI Hub.

Option 2: Export with Custom Configurations

Use the Qualcomm® AI Hub Models Python library to compile and export the model with your own:

  • Custom weights (e.g., fine-tuned checkpoints)
  • Custom input shapes
  • Target device and runtime configurations

This option is ideal if you need to customize the model beyond the default configuration provided here.

See the InternImage repository on GitHub for usage instructions.
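As a rough sketch of the typical Qualcomm AI Hub Models workflow (exact package extras, device name strings, and flags may differ by version; the GitHub instructions are authoritative):

```shell
# Install the Qualcomm AI Hub Models library from PyPI
pip install qai-hub-models

# Configure AI Hub credentials (API token from your Qualcomm AI Hub account)
qai-hub configure --api_token YOUR_API_TOKEN

# Compile, profile, and export InternImage for a hosted device
# (device name shown here is an example; flag names may vary by release)
python -m qai_hub_models.models.internimage.export --device "Snapdragon 8 Elite (Mobile)"
```

The export script submits compile and profile jobs to hosted devices, then downloads the resulting target asset.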

Model Details

Model Type: Image classification

Model Stats:

  • Model checkpoint: internimage_t_1k_224
  • Input resolution: 1x3x224x224
  • Number of parameters: 30.6M
  • Model size (float): 117 MB
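The 1x3x224x224 input follows the usual NCHW image-classification layout. Below is a minimal NumPy sketch of the corresponding preprocessing, assuming standard ImageNet normalization (confirm the exact mean/std against the model's export pipeline; the `preprocess` helper is hypothetical):

```python
import numpy as np

# Assumed ImageNet normalization constants (RGB order).
IMAGENET_MEAN = np.array([0.485, 0.456, 0.406], dtype=np.float32)
IMAGENET_STD = np.array([0.229, 0.224, 0.225], dtype=np.float32)

def preprocess(image_hwc_uint8):
    """Convert an HWC uint8 RGB image (224, 224, 3) into the model's
    NCHW float32 input of shape (1, 3, 224, 224)."""
    x = image_hwc_uint8.astype(np.float32) / 255.0   # scale to [0, 1]
    x = (x - IMAGENET_MEAN) / IMAGENET_STD           # per-channel normalize
    return np.transpose(x, (2, 0, 1))[np.newaxis]    # HWC -> NCHW, add batch
```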

Performance Summary

| Model | Runtime | Precision | Chipset | Inference Time (ms) | Peak Memory Range (MB) | Primary Compute Unit |
|---|---|---|---|---|---|---|
| InternImage | PRECOMPILED_QNN_ONNX | float | Snapdragon® X Elite | 52.506 | 66 - 66 | NPU |
| InternImage | PRECOMPILED_QNN_ONNX | float | Snapdragon® 8 Gen 3 Mobile | 34.782 | 3 - 10 | NPU |
| InternImage | PRECOMPILED_QNN_ONNX | float | Qualcomm® QCS8550 (Proxy) | 50.286 | 0 - 81 | NPU |
| InternImage | PRECOMPILED_QNN_ONNX | float | Qualcomm® QCS9075 | 52.317 | 0 - 4 | NPU |
| InternImage | PRECOMPILED_QNN_ONNX | float | Snapdragon® 8 Elite For Galaxy Mobile | 26.287 | 1 - 9 | NPU |
| InternImage | PRECOMPILED_QNN_ONNX | float | Snapdragon® 8 Elite Gen 5 Mobile | 20.279 | 1 - 11 | NPU |
| InternImage | QNN_CONTEXT_BINARY | float | Snapdragon® X Elite | 52.126 | 1 - 1 | NPU |
| InternImage | QNN_CONTEXT_BINARY | float | Snapdragon® 8 Gen 3 Mobile | 35.001 | 1 - 8 | NPU |
| InternImage | QNN_CONTEXT_BINARY | float | Qualcomm® QCS8275 (Proxy) | 95.386 | 1 - 9 | NPU |
| InternImage | QNN_CONTEXT_BINARY | float | Qualcomm® QCS8550 (Proxy) | 50.041 | 1 - 2 | NPU |
| InternImage | QNN_CONTEXT_BINARY | float | Qualcomm® QCS9075 | 51.556 | 1 - 3 | NPU |
| InternImage | QNN_CONTEXT_BINARY | float | Qualcomm® QCS8450 (Proxy) | 59.879 | 1 - 10 | NPU |
| InternImage | QNN_CONTEXT_BINARY | float | Snapdragon® 8 Elite For Galaxy Mobile | 25.489 | 1 - 15 | NPU |
| InternImage | QNN_CONTEXT_BINARY | float | Snapdragon® 8 Elite Gen 5 Mobile | 19.927 | 1 - 11 | NPU |

License

  • The license for the original implementation of InternImage can be found here.
