v0.50.1
See https://github.com/qualcomm/ai-hub-models/releases/v0.50.1 for the changelog.
- README.md +9 -9
- release_assets.json +1 -1
README.md
CHANGED
@@ -15,7 +15,7 @@ pipeline_tag: image-segmentation
 DeepLabXception is a semantic segmentation model supporting multiple backbones like ResNet-101 and Xception, with flexible dataset compatibility including COCO, VOC, and Cityscapes.
 
 This is based on the implementation of DeepLabXception found [here](https://github.com/LikeLy-Journey/SegmenTron).
-This repository contains pre-exported model files optimized for Qualcomm® devices. You can use the [Qualcomm® AI Hub Models](https://github.com/qualcomm/ai-hub-models/blob/main/qai_hub_models/models/deeplab_xception) library to export with custom configurations. More details on model performance across various devices can be found [here](#performance-summary).
+This repository contains pre-exported model files optimized for Qualcomm® devices. You can use the [Qualcomm® AI Hub Models](https://github.com/qualcomm/ai-hub-models/blob/main/src/qai_hub_models/models/deeplab_xception) library to export with custom configurations. More details on model performance across various devices can be found [here](#performance-summary).
 
 Qualcomm AI Hub Models uses [Qualcomm AI Hub Workbench](https://workbench.aihub.qualcomm.com) to compile, profile, and evaluate this model. [Sign up](https://myaccount.qualcomm.com/signup) to run these models on a hosted Qualcomm® device.
 
@@ -28,26 +28,26 @@ Below are pre-exported model assets ready for deployment.
 
 | Runtime | Precision | Chipset | SDK Versions | Download |
 |---|---|---|---|---|
-| ONNX | float | Universal | QAIRT 2.42, ONNX Runtime 1.24.1 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.
-| ONNX | w8a8 | Universal | QAIRT 2.42, ONNX Runtime 1.24.1 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.
-| QNN_DLC | float | Universal | QAIRT 2.43 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.
-| QNN_DLC | w8a8 | Universal | QAIRT 2.43 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.
-| TFLITE | float | Universal | QAIRT 2.43, TFLite 2.17.0 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.
-| TFLITE | w8a8 | Universal | QAIRT 2.43, TFLite 2.17.0 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.
+| ONNX | float | Universal | QAIRT 2.42, ONNX Runtime 1.24.1 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.1/deeplab_xception-onnx-float.zip)
+| ONNX | w8a8 | Universal | QAIRT 2.42, ONNX Runtime 1.24.1 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.1/deeplab_xception-onnx-w8a8.zip)
+| QNN_DLC | float | Universal | QAIRT 2.43 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.1/deeplab_xception-qnn_dlc-float.zip)
+| QNN_DLC | w8a8 | Universal | QAIRT 2.43 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.1/deeplab_xception-qnn_dlc-w8a8.zip)
+| TFLITE | float | Universal | QAIRT 2.43, TFLite 2.17.0 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.1/deeplab_xception-tflite-float.zip)
+| TFLITE | w8a8 | Universal | QAIRT 2.43, TFLite 2.17.0 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.1/deeplab_xception-tflite-w8a8.zip)
 
 For more device-specific assets and performance metrics, visit **[DeepLabXception on Qualcomm® AI Hub](https://aihub.qualcomm.com/models/deeplab_xception)**.
 
 
 ### Option 2: Export with Custom Configurations
 
-Use the [Qualcomm® AI Hub Models](https://github.com/qualcomm/ai-hub-models/blob/main/qai_hub_models/models/deeplab_xception) Python library to compile and export the model with your own:
+Use the [Qualcomm® AI Hub Models](https://github.com/qualcomm/ai-hub-models/blob/main/src/qai_hub_models/models/deeplab_xception) Python library to compile and export the model with your own:
 - Custom weights (e.g., fine-tuned checkpoints)
 - Custom input shapes
 - Target device and runtime configurations
 
 This option is ideal if you need to customize the model beyond the default configuration provided here.
 
-See our [DeepLabXception on GitHub](https://github.com/qualcomm/ai-hub-models/blob/main/qai_hub_models/models/deeplab_xception) repository for usage instructions.
+See our [DeepLabXception on GitHub](https://github.com/qualcomm/ai-hub-models/blob/main/src/qai_hub_models/models/deeplab_xception) repository for usage instructions.
 
 ## Model Details
 
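The download URLs in the table above follow one naming pattern. A minimal Python sketch of that pattern — the `asset_url` helper and the lowercasing rule are inferred from the listed URLs, not an official API:

```python
# Sketch: build a pre-exported asset URL from the naming pattern seen in the
# release table. The pattern {base}/releases/v{version}/{model}-{runtime}-{precision}.zip
# is inferred from the listed assets, not an official API.

BASE = ("https://qaihub-public-assets.s3.us-west-2.amazonaws.com"
        "/qai-hub-models/models/deeplab_xception/releases")

def asset_url(version: str, runtime: str, precision: str) -> str:
    """Return the download URL for one (runtime, precision) combination."""
    # Runtime names are lowercased in the file name (ONNX -> onnx, QNN_DLC -> qnn_dlc).
    return f"{BASE}/v{version}/deeplab_xception-{runtime.lower()}-{precision}.zip"

print(asset_url("0.50.1", "ONNX", "float"))
# -> .../releases/v0.50.1/deeplab_xception-onnx-float.zip
```

The printed URL matches the ONNX float entry in the table; the same call with `"TFLITE"` or `"QNN_DLC"` reproduces the other rows.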
release_assets.json
CHANGED
@@ -1 +1 @@
-{"version":"0.50.
+{"version":"0.50.1","precisions":{"w8a8":{"universal_assets":{"tflite":{"tool_versions":{"qairt":"2.43.0.260127150333_193827","tflite":"2.17.0"},"download_url":"https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.1/deeplab_xception-tflite-w8a8.zip"},"qnn_dlc":{"tool_versions":{"qairt":"2.43.0.260127150333_193827"},"download_url":"https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.1/deeplab_xception-qnn_dlc-w8a8.zip"},"onnx":{"tool_versions":{"qairt":"2.42.0.251225135753_193295","onnx_runtime":"1.24.1"},"download_url":"https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.1/deeplab_xception-onnx-w8a8.zip"}}},"float":{"universal_assets":{"tflite":{"tool_versions":{"qairt":"2.43.0.260127150333_193827","tflite":"2.17.0"},"download_url":"https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.1/deeplab_xception-tflite-float.zip"},"qnn_dlc":{"tool_versions":{"qairt":"2.43.0.260127150333_193827"},"download_url":"https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.1/deeplab_xception-qnn_dlc-float.zip"},"onnx":{"tool_versions":{"qairt":"2.42.0.251225135753_193295","onnx_runtime":"1.24.1"},"download_url":"https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.1/deeplab_xception-onnx-float.zip"}}}}}
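For scripting against `release_assets.json`, the nesting in the new payload is `precisions` → `universal_assets` → runtime → `download_url`. A minimal sketch of walking it — the `list_assets` helper is illustrative, and the inline document is a trimmed copy of the real file:

```python
import json

# Sketch: enumerate (precision, runtime, url) triples from a release_assets.json
# payload. Field names match the JSON in this release; the helper is illustrative.
def list_assets(payload: dict):
    for precision, entry in payload["precisions"].items():
        for runtime, asset in entry["universal_assets"].items():
            yield precision, runtime, asset["download_url"]

# Trimmed example mirroring the real file's structure (one precision, one runtime).
doc = json.loads('''{"version": "0.50.1", "precisions": {"float": {"universal_assets": {
  "onnx": {"tool_versions": {"qairt": "2.42.0.251225135753_193295", "onnx_runtime": "1.24.1"},
           "download_url": "https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/deeplab_xception/releases/v0.50.1/deeplab_xception-onnx-float.zip"}}}}}''')

for precision, runtime, url in list_assets(doc):
    print(precision, runtime, url)
# -> float onnx https://.../v0.50.1/deeplab_xception-onnx-float.zip
```

Run against the full file, the loop yields all six assets listed in the README table.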