Inference Providers
Active filters: arm64
cudabenchmarktest/personaplex-7b-turbo2bit • Updated • 83 • 2
vlad-m-dev/distiluse-base-multilingual-v2-merged-onnx • Feature Extraction • Updated • 1
onnx-community/distiluse-base-multilingual-v2-merged-onnx • Feature Extraction • Updated • 1
halley-ai/gpt-oss-20b-MLX-4bit-gs32 • Text Generation • 21B • Updated • 81 • 2
halley-ai/gpt-oss-20b-MLX-6bit-gs32 • Text Generation • 21B • Updated • 66 • 1
halley-ai/gpt-oss-20b-MLX-5bit-gs32 • Text Generation • 21B • Updated • 48 • 1
halley-ai/gpt-oss-120b-MLX-8bit-gs32 • Text Generation • 117B • Updated • 108 • 1
halley-ai/gpt-oss-120b-MLX-bf16 • Text Generation • 117B • Updated • 386 • 3
halley-ai/gpt-oss-120b-MLX-6bit-gs64 • Text Generation • 117B • Updated • 110 • 1
halley-ai/Qwen3-Next-80B-A3B-Instruct-MLX-4bit-gs64 • Text Generation • 80B • Updated • 23 • 1
halley-ai/Qwen3-Next-80B-A3B-Instruct-MLX-5bit-gs32 • Text Generation • 80B • Updated • 23 • 1
halley-ai/Qwen3-Next-80B-A3B-Instruct-MLX-6bit-gs64 • Text Generation • 80B • Updated • 9 • 1
mjbommar/glaurung-binary-tokenizer-001 • Feature Extraction • Updated
mjbommar/glaurung-binary-tokenizer-002 • Feature Extraction • Updated • 1
Hellohal2064/vllm-dgx-spark-gb10 • Text Generation • Updated • 2
thehighnotes/vllm-jetson-orin • Text Generation • Updated