---
license: mit
tags:
- Transformer
- ONNX
- clip
---

# CLIP

- [Original Repository: openai/CLIP](https://github.com/openai/CLIP)

## How to convert ONNX to axmodel

If you are interested in model conversion, you can export an ONNX model from the original repo and compile it into an axmodel with the Pulsar2 toolchain:

- [clip.axera: export ONNX from the CLIP repo](https://github.com/AXERA-TECH/clip.axera)
- [Pulsar2 documentation: how to convert ONNX to axmodel](https://pulsar2-docs.readthedocs.io/en/latest/)

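For orientation, a Pulsar2 conversion is typically driven by a build command plus a JSON configuration describing quantization and the target NPU. The fragment below is only a hypothetical sketch: the tensor name, calibration dataset path, and quantization values are illustrative assumptions, not settings taken from this repo; consult the Pulsar2 documentation linked above for the authoritative options.

```json
{
  "model_type": "ONNX",
  "npu_mode": "NPU1",
  "quant": {
    "input_configs": [
      {
        "tensor_name": "DEFAULT",
        "calibration_dataset": "./calib_images.tar",
        "calibration_size": 32,
        "calibration_mean": [123.675, 116.28, 103.53],
        "calibration_std": [58.395, 57.12, 57.375]
      }
    ],
    "calibration_method": "MinMax"
  },
  "target_hardware": "AX650"
}
```

With such a config, the compile step would look roughly like `pulsar2 build --input model.onnx --config config.json --output_dir output` (flags assumed from the Pulsar2 docs; check them before use).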
## How to deploy the axmodel

- [AXera NPU HOST Runtime](https://github.com/AXERA-TECH/CLIP-ONNX-AX650-CPP)

## Supported Platforms

- AX650
  - [M4N-Dock(爱芯派Pro)](https://wiki.sipeed.com/hardware/zh/maixIV/m4ndock/m4ndock.html)
  - [M.2 Accelerator card](https://axcl-docs.readthedocs.io/zh-cn/latest/doc_guide_hardware.html)

## Inference with AX650 Host

- Check the [on-board reference](https://github.com/AXERA-TECH/CLIP-ONNX-AX650-CPP) for more information.
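The runtime above handles the NPU-side encoding; the scoring CLIP then performs on the resulting embeddings is simple enough to sketch in a few lines. The snippet below is a minimal illustration using random placeholder vectors in place of the real image/text encoder outputs, and `logit_scale=100.0` is an assumed value chosen for illustration:

```python
import numpy as np

def clip_scores(image_emb: np.ndarray, text_embs: np.ndarray,
                logit_scale: float = 100.0) -> np.ndarray:
    """CLIP-style scoring: L2-normalize the embeddings, take scaled
    cosine similarities, then softmax over the candidate texts."""
    image_emb = image_emb / np.linalg.norm(image_emb)
    text_embs = text_embs / np.linalg.norm(text_embs, axis=1, keepdims=True)
    logits = logit_scale * (text_embs @ image_emb)  # one similarity per text
    exp = np.exp(logits - logits.max())             # numerically stable softmax
    return exp / exp.sum()

# Placeholder embeddings standing in for the NPU encoder outputs.
rng = np.random.default_rng(0)
image_emb = rng.standard_normal(512)
text_embs = rng.standard_normal((3, 512))
probs = clip_scores(image_emb, text_embs)
print(probs)
```

The output is a probability distribution over the candidate texts; the index of the largest entry is the best-matching caption for the image.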