Inference Providers
Active filters: arm64
cudabenchmarktest/personaplex-7b-turbo2bit • Updated • 57 • 2
vlad-m-dev/distiluse-base-multilingual-v2-merged-onnx • Feature Extraction • Updated • 1
onnx-community/distiluse-base-multilingual-v2-merged-onnx • Feature Extraction • Updated • 1
halley-ai/gpt-oss-20b-MLX-4bit-gs32 • Text Generation • 21B • Updated • 80 • 2
halley-ai/gpt-oss-20b-MLX-6bit-gs32 • Text Generation • 21B • Updated • 66 • 1
halley-ai/gpt-oss-20b-MLX-5bit-gs32 • Text Generation • 21B • Updated • 49 • 1
halley-ai/gpt-oss-120b-MLX-8bit-gs32 • Text Generation • 117B • Updated • 106 • 1
halley-ai/gpt-oss-120b-MLX-bf16 • Text Generation • 117B • Updated • 391 • 3
halley-ai/gpt-oss-120b-MLX-6bit-gs64 • Text Generation • 117B • Updated • 109 • 1
halley-ai/Qwen3-Next-80B-A3B-Instruct-MLX-4bit-gs64 • Text Generation • 80B • Updated • 22 • 1
halley-ai/Qwen3-Next-80B-A3B-Instruct-MLX-5bit-gs32 • Text Generation • 80B • Updated • 25 • 1
halley-ai/Qwen3-Next-80B-A3B-Instruct-MLX-6bit-gs64 • Text Generation • 80B • Updated • 9 • 1
mjbommar/glaurung-binary-tokenizer-001 • Feature Extraction • Updated
mjbommar/glaurung-binary-tokenizer-002 • Feature Extraction • Updated • 1
Hellohal2064/vllm-dgx-spark-gb10 • Text Generation • Updated • 2
thehighnotes/vllm-jetson-orin • Text Generation • Updated