qaihm-bot committed (verified)
Commit 7bc4259 · 1 Parent(s): 2900031

See https://github.com/qualcomm/ai-hub-models/releases/v0.50.1 for the changelog.

Files changed (2)
  1. README.md +8 -8
  2. release_assets.json +1 -1
README.md CHANGED
@@ -15,7 +15,7 @@ pipeline_tag: image-classification
  Beit is a machine learning model that can classify images from the Imagenet dataset. It can also be used as a backbone in building more complex models for specific use cases.
 
  This is based on the implementation of Beit found [here](https://github.com/microsoft/unilm/tree/master/beit).
- This repository contains pre-exported model files optimized for Qualcomm® devices. You can use the [Qualcomm® AI Hub Models](https://github.com/qualcomm/ai-hub-models/blob/main/qai_hub_models/models/beit) library to export with custom configurations. More details on model performance across various devices, can be found [here](#performance-summary).
+ This repository contains pre-exported model files optimized for Qualcomm® devices. You can use the [Qualcomm® AI Hub Models](https://github.com/qualcomm/ai-hub-models/blob/main/src/qai_hub_models/models/beit) library to export with custom configurations. More details on model performance across various devices, can be found [here](#performance-summary).
 
  Qualcomm AI Hub Models uses [Qualcomm AI Hub Workbench](https://workbench.aihub.qualcomm.com) to compile, profile, and evaluate this model. [Sign up](https://myaccount.qualcomm.com/signup) to run these models on a hosted Qualcomm® device.
 
@@ -28,25 +28,25 @@ Below are pre-exported model assets ready for deployment.
 
  | Runtime | Precision | Chipset | SDK Versions | Download |
  |---|---|---|---|---|
- | ONNX | float | Universal | QAIRT 2.42, ONNX Runtime 1.24.1 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.0/beit-onnx-float.zip)
- | ONNX | w8a16 | Universal | QAIRT 2.42, ONNX Runtime 1.24.1 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.0/beit-onnx-w8a16.zip)
- | QNN_DLC | float | Universal | QAIRT 2.43 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.0/beit-qnn_dlc-float.zip)
- | QNN_DLC | w8a16 | Universal | QAIRT 2.43 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.0/beit-qnn_dlc-w8a16.zip)
- | TFLITE | float | Universal | QAIRT 2.43, TFLite 2.17.0 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.0/beit-tflite-float.zip)
+ | ONNX | float | Universal | QAIRT 2.42, ONNX Runtime 1.24.1 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.1/beit-onnx-float.zip)
+ | ONNX | w8a16 | Universal | QAIRT 2.42, ONNX Runtime 1.24.1 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.1/beit-onnx-w8a16.zip)
+ | QNN_DLC | float | Universal | QAIRT 2.43 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.1/beit-qnn_dlc-float.zip)
+ | QNN_DLC | w8a16 | Universal | QAIRT 2.43 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.1/beit-qnn_dlc-w8a16.zip)
+ | TFLITE | float | Universal | QAIRT 2.43, TFLite 2.17.0 | [Download](https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.1/beit-tflite-float.zip)
 
  For more device-specific assets and performance metrics, visit **[Beit on Qualcomm® AI Hub](https://aihub.qualcomm.com/models/beit)**.
 
 
  ### Option 2: Export with Custom Configurations
 
- Use the [Qualcomm® AI Hub Models](https://github.com/qualcomm/ai-hub-models/blob/main/qai_hub_models/models/beit) Python library to compile and export the model with your own:
+ Use the [Qualcomm® AI Hub Models](https://github.com/qualcomm/ai-hub-models/blob/main/src/qai_hub_models/models/beit) Python library to compile and export the model with your own:
  - Custom weights (e.g., fine-tuned checkpoints)
  - Custom input shapes
  - Target device and runtime configurations
 
  This option is ideal if you need to customize the model beyond the default configuration provided here.
 
- See our repository for [Beit on GitHub](https://github.com/qualcomm/ai-hub-models/blob/main/qai_hub_models/models/beit) for usage instructions.
+ See our repository for [Beit on GitHub](https://github.com/qualcomm/ai-hub-models/blob/main/src/qai_hub_models/models/beit) for usage instructions.
 
  ## Model Details
release_assets.json CHANGED
@@ -1 +1 @@
- {"version":"0.50.0","precisions":{"w8a16":{"universal_assets":{"qnn_dlc":{"tool_versions":{"qairt":"2.43.0.260127150333_193827"},"download_url":"https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.0/beit-qnn_dlc-w8a16.zip"},"onnx":{"tool_versions":{"qairt":"2.42.0.251225135753_193295","onnx_runtime":"1.24.1"},"download_url":"https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.0/beit-onnx-w8a16.zip"}}},"float":{"universal_assets":{"tflite":{"tool_versions":{"qairt":"2.43.0.260127150333_193827","tflite":"2.17.0"},"download_url":"https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.0/beit-tflite-float.zip"},"qnn_dlc":{"tool_versions":{"qairt":"2.43.0.260127150333_193827"},"download_url":"https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.0/beit-qnn_dlc-float.zip"},"onnx":{"tool_versions":{"qairt":"2.42.0.251225135753_193295","onnx_runtime":"1.24.1"},"download_url":"https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.0/beit-onnx-float.zip"}}}}}
+ {"version":"0.50.1","precisions":{"w8a16":{"universal_assets":{"qnn_dlc":{"tool_versions":{"qairt":"2.43.0.260127150333_193827"},"download_url":"https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.1/beit-qnn_dlc-w8a16.zip"},"onnx":{"tool_versions":{"qairt":"2.42.0.251225135753_193295","onnx_runtime":"1.24.1"},"download_url":"https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.1/beit-onnx-w8a16.zip"}}},"float":{"universal_assets":{"tflite":{"tool_versions":{"qairt":"2.43.0.260127150333_193827","tflite":"2.17.0"},"download_url":"https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.1/beit-tflite-float.zip"},"qnn_dlc":{"tool_versions":{"qairt":"2.43.0.260127150333_193827"},"download_url":"https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.1/beit-qnn_dlc-float.zip"},"onnx":{"tool_versions":{"qairt":"2.42.0.251225135753_193295","onnx_runtime":"1.24.1"},"download_url":"https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.1/beit-onnx-float.zip"}}}}}
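The `release_assets.json` payload above is plain JSON keyed by precision, then runtime. A minimal sketch of looking up a download URL from it; the snippet embeds a trimmed copy of the new file (float/TFLite entry only), and the `asset_url` helper is illustrative, not part of the repository:

```python
import json

# Trimmed excerpt of the v0.50.1 release_assets.json shown in the diff above.
release_assets = json.loads("""
{"version": "0.50.1",
 "precisions": {
   "float": {
     "universal_assets": {
       "tflite": {
         "tool_versions": {"qairt": "2.43.0.260127150333_193827", "tflite": "2.17.0"},
         "download_url": "https://qaihub-public-assets.s3.us-west-2.amazonaws.com/qai-hub-models/models/beit/releases/v0.50.1/beit-tflite-float.zip"
       }
     }
   }
 }}
""")

def asset_url(assets: dict, precision: str, runtime: str) -> str:
    """Return the download URL for a given precision/runtime pair (hypothetical helper)."""
    return assets["precisions"][precision]["universal_assets"][runtime]["download_url"]

print(asset_url(release_assets, "float", "tflite"))
```

The same lookup works for the other entries (`w8a16` precision, `onnx` and `qnn_dlc` runtimes) in the full file.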