You can set up a server to use computer vision models with Inference on the following devices:
- ARM CPU (macOS, Raspberry Pi)
- x86 CPU (macOS, Linux, Windows)
- NVIDIA GPU
- NVIDIA Jetson (JetPack 4.5.x, JetPack 4.6.x, JetPack 5.x)
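For local deployment on any of the devices above, one minimal path is the CLI. This is a sketch of the setup commands; the package and command names below assume the `inference-cli` tooling and should be confirmed against the installation docs for your device.

```shell
# Sketch: install the CLI and start a local Inference server
# (assumes the inference-cli package; verify against the install docs)
pip install inference-cli
inference server start
```

By default the server listens locally (commonly on port 9001), and the CLI selects a CPU or GPU build based on the host hardware.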
You can also run Inference in the cloud with the [Roboflow hosted inference offering](https://docs.roboflow.com/deploy/hosted-api).
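As a sketch of the hosted path, a request to the hosted inference API is an HTTP call against a model-specific URL. The endpoint shape, model ID, version, and API key below are illustrative placeholders; confirm the exact format against the hosted API documentation.

```python
from urllib.parse import urlencode

# Hosted object-detection endpoint (assumed base URL; confirm in the docs)
HOSTED_BASE = "https://detect.roboflow.com"

def build_inference_url(model_id: str, version: int, api_key: str) -> str:
    """Build the hosted inference URL for a given model and version.

    The model ID and API key passed in are placeholders supplied by
    the caller, not real credentials.
    """
    query = urlencode({"api_key": api_key})
    return f"{HOSTED_BASE}/{model_id}/{version}?{query}"

# Example with placeholder values:
url = build_inference_url("my-project", 1, "MY_API_KEY")
# An HTTP POST to `url` with an image payload would then return
# predictions; see the hosted API docs for accepted image formats.
```

A client would typically POST a base64-encoded image or an image URL to this endpoint and receive JSON predictions in response.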
To learn more about device compatibility with different models, refer to the [model compatibility matrix](/quickstart/compatability_matrix).