Active filters: tool-use
QuantFactory/miscii-14b-1028-GGUF • Text Generation • 15B • 89 • 3
legionarius/watt-tool-8B-GGUF • 8B • 70 • 2
TimeLordRaps/watt-tool-8B-Q8_0-GGUF • 8B • 3
Nekuromento/watt-tool-8B-Q4_K_M-GGUF • 8B • 1
Nekuromento/watt-tool-8B-Q5_K_M-GGUF • 8B • 1
Nekuromento/watt-tool-8B-Q6_K-GGUF • 8B • 3
Nekuromento/watt-tool-8B-Q8_0-GGUF • 8B • 5
ejschwartz/watt-tool-8B-Q4_K_M-GGUF • 8B • 3
dwetzel/watt-tool-70B-GPTQ-INT4 • 11B • 1
mlx-community/watt-tool-8B • 20 • 4
mradermacher/watt-tool-8B-GGUF • 8B • 100 • 5
mradermacher/watt-tool-8B-i1-GGUF • 8B • 111 • 1
mradermacher/watt-tool-70B-GGUF • 71B • 32
mradermacher/watt-tool-70B-i1-GGUF • 71B • 158
Tonic/c4ai-command-a-03-2025-4bit_nf4_double • Text Generation • 114B • 4
Tonic/c4ai-command-a-03-2025-4bit_fp4 • Text Generation • 113B • 5
Tonic/c4ai-command-a-03-2025-4bit_nf4_no_double • Text Generation • 113B • 2
fuzzy-mittenz/watt-tool-8B-Q4_K_M-GGUF • 8B • 4
Scotto2025/watt-tool-70B-mlx-6Bit • 15B • 7
Scotto2025/watt-tool-8B-mlx-8Bit • 2B • 7
Salesforce/Llama-xLAM-2-70b-fc-r • Text Generation • 71B • 241 • 48
Salesforce/Llama-xLAM-2-8b-fc-r • Text Generation • 8B • 76.6k • 58
Salesforce/xLAM-2-32b-fc-r • Text Generation • 33B • 783 • 32
Salesforce/xLAM-2-3b-fc-r • Text Generation • 3B • 70.5k • 16
Salesforce/xLAM-2-1b-fc-r • Text Generation • 2B • 44.1k • 12
Salesforce/xLAM-2-3b-fc-r-gguf • Text Generation • 3B • 254 • 6
Salesforce/xLAM-2-1b-fc-r-gguf • Text Generation • 2B • 1.32k • 2
Salesforce/Llama-xLAM-2-8b-fc-r-gguf • Text Generation • 8B • 512 • 18
bartowski/watt-ai_watt-tool-70B-GGUF • Text Generation • 71B • 107
Ronny/xLAM-2-32b-fc-r-Q4_K_M-GGUF • Text Generation • 33B