You can set up a server to use computer vision models with Inference on the following devices:

  • ARM CPU (macOS, Raspberry Pi)
  • x86 CPU (macOS, Linux, Windows)
  • NVIDIA GPU
  • NVIDIA Jetson (JetPack 4.5.x, JetPack 4.6.x, JetPack 5.x)

You can also run Inference in the cloud with the Roboflow hosted inference offering.
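As a sketch of local setup, the Inference server is typically installed and started via the `inference-cli` package; the exact commands and flags may vary by version and device, so treat this as an illustration rather than a definitive procedure:

```shell
# Install the Inference CLI (assumes Python and pip are available)
pip install inference-cli

# Start a local Inference server; on supported hardware this runs
# the appropriate container (CPU or GPU) for your device
inference server start
```

On NVIDIA GPU and Jetson devices, the server can take advantage of hardware acceleration; on ARM and x86 CPUs it falls back to CPU execution.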

To learn more about device compatibility with different models, refer to the model compatibility matrix.