Qualcomm AI Runtime – V75 Android arm64 (v2.46.0.260424)

Mirror of the Qualcomm AI Runtime SDK Community Edition runtime libraries needed to execute precompiled QNN-ONNX bundles on Snapdragon 8 Gen 3 (Hexagon V75) Android devices.

This mirror exists so the Sona Forge app has a single, stable download endpoint under huggingface.co/sona-forge/ for every binary it needs at runtime. Sona Forge is a local-first Android image-transformation app and never phones home to any provider.
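The "single, stable download endpoint" amounts to Hugging Face `resolve` URLs under the sona-forge namespace. A minimal sketch of the URL pattern the app would fetch, noting that the repo name `qnn-v75-runtime` below is a placeholder assumption (only the `sona-forge` org path appears in this card):

```shell
# Hugging Face "resolve" URL pattern for raw file downloads.
# Repo name is a placeholder; only the sona-forge org is from this card.
BASE="https://huggingface.co/sona-forge/qnn-v75-runtime/resolve/main"
for f in libQnnHtp.so libQnnSystem.so libQnnHtpV75Stub.so \
         libQnnHtpPrepare.so libQnnHtpV75.so libQnnHtpV75Skel.so; do
  echo "$BASE/$f"   # the app fetches each of these at first run
done
```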

Files

File                 Side                Bytes        Purpose
libQnnHtp.so         App / arm64-v8a     3,495,096    Generic HTP backend dispatcher
libQnnSystem.so      App / arm64-v8a     3,864,784    QNN system support
libQnnHtpV75Stub.so  App / arm64-v8a     736,288      V75 dispatcher stub (CPU side)
libQnnHtpPrepare.so  App / arm64-v8a     90,431,072   Graph compiler / preparation pass
libQnnHtpV75.so      DSP / hexagon-v75   11,388,152   V75 backend (loaded onto Hexagon)
libQnnHtpV75Skel.so  DSP / hexagon-v75   11,085,052   V75 FastRPC skel (loaded onto Hexagon)

The four app-side libraries are linked against Android's bionic libc and are loaded on the application processor via dlopen / System.load. The two DSP-side libraries are Hexagon (QDSP6) binaries that FastRPC loads onto the Hexagon NPU itself.
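A minimal sketch of the app-side load sequence, assuming the app loads the CPU-side libraries explicitly by absolute path (the class name, directory argument, and load order shown here are illustrative assumptions, not code from Sona Forge):

```java
import java.io.File;

public class QnnLoader {
    // App-side libraries, loaded on the application processor.
    // The two DSP-side libs (libQnnHtpV75.so, libQnnHtpV75Skel.so) are
    // NOT loaded here: FastRPC pulls them onto the Hexagon NPU, typically
    // locating them via the ADSP_LIBRARY_PATH search path.
    static final String[] APP_LIBS = {
        "libQnnSystem.so",
        "libQnnHtp.so",
        "libQnnHtpV75Stub.so",
        "libQnnHtpPrepare.so", // only needed for on-device graph preparation
    };

    // libDir would be the app's extracted arm64-v8a native-lib directory.
    public static void load(String libDir) {
        for (String name : APP_LIBS) {
            System.load(new File(libDir, name).getAbsolutePath());
        }
    }
}
```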

Source

Extracted from the official Qualcomm AI Runtime SDK Community Edition v2.46.0, specifically the artefacts under:

  • qairt/2.46.0.260424/lib/aarch64-android/
  • qairt/2.46.0.260424/lib/hexagon-v75/unsigned/

Licence

Distributed under the Qualcomm AI Engine Direct SDK License Agreement that covers QAIRT Community Edition. See LICENSE for the redistribution terms. Sona Forge end-users acquire the runtime on their device when the app fetches this pack; the download itself is the redistribution event.

Compatibility

The QAIRT runtime is backward-compatible across the 2.x minor series: the v2.46 runtime loads context binaries compiled against v2.42 (and later) Workbench releases.
