Active filters: exaone
second-state/EXAONE-Deep-7.8B-GGUF • Text Generation • 8B • 37 downloads • 1 like
gaianet/EXAONE-Deep-7.8B-GGUF • Text Generation • 8B • 6 downloads
BlackBeenie/EXAONE-Deep-32B-Q4_K_M-GGUF • Text Generation • 32B • 7 downloads
Mungert/EXAONE-Deep-2.4B-GGUF • Text Generation • 2B • 143 downloads • 3 likes
john9815/EXAONE-3.5-7.8B-Instruct-Llamafied-Q4_K_M-GGUF • Text Generation • 8B • 6 downloads
Mungert/EXAONE-Deep-7.8B-GGUF • Text Generation • 8B • 370 downloads • 5 likes
cococoomo/Exaone3.5-7.8B_ReST_V0_Quantized • 8B
AGCobra/EXAONE-Deep-32B-mlx-4Bit • Text Generation • 5B • 4 downloads
mlx-community/EXAONE-Deep-32B-mlx-8Bit • Text Generation • 9B • 2 downloads
second-state/EXAONE-3.5-32B-Instruct-GGUF • Text Generation • 32B • 82 downloads
gaianet/EXAONE-3.5-32B-Instruct-GGUF • Text Generation • 32B • 8 downloads
second-state/EXAONE-3.5-7.8B-Instruct-GGUF • Text Generation • 8B • 62 downloads
gaianet/EXAONE-3.5-7.8B-Instruct-GGUF • Text Generation • 8B • 34 downloads
second-state/EXAONE-3.5-2.4B-Instruct-GGUF • Text Generation • 2B • 26 downloads
gaianet/EXAONE-3.5-2.4B-Instruct-GGUF • Text Generation • 2B • 26 downloads
tensorblock/EXAONE-Deep-7.8B-GGUF • Text Generation • 8B • 58 downloads
RichardErkhov/LGAI-EXAONE_-_EXAONE-Deep-2.4B-awq • 2B
QuantFactory/EXAONE-Deep-2.4B-GGUF • Text Generation • 3B • 3 downloads • 2 likes
tensorblock/EXAONE-Deep-2.4B-GGUF • Text Generation • 2B • 25 downloads
RichardErkhov/LGAI-EXAONE_-_EXAONE-3.5-2.4B-Instruct-awq • 2B
Mungert/EXAONE-Deep-32B-GGUF • Text Generation • 32B • 61 downloads • 5 likes
QuantFactory/EXAONE-Deep-7.8B-GGUF • Text Generation • 8B • 42 downloads • 3 likes
werty1248/EXAONE-Deep-7.8B-Ko-Thought-test • Text Generation • 8B • 2 downloads • 1 like
ysn-rfd/EXAONE-Deep-2.4B-GGUF • Text Generation • 2B • 2 downloads • 1 like
RichardErkhov/LGAI-EXAONE_-_EXAONE-Deep-7.8B-awq • 8B
Echo9Zulu/EXAONE-Deep-2.4B-int8_asym-ov
Echo9Zulu/EXAONE-Deep-2.4B-int4_asym-gptq-se-ov
good593/EXAONE-3.5-2.4B-fine-tuning • Text Generation • 2 downloads
Echo9Zulu/EXAONE-Deep-7.8B-int4_asym-gptq-se-ov
Youseff1987/EXAONE-Deep-7.8B-bnb-4bit • Text Generation • 8B
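Any file in the GGUF repositories listed above can be fetched directly through the Hub's stable resolve URL pattern (`https://huggingface.co/<repo_id>/resolve/main/<filename>`). A minimal stdlib-only sketch follows; the quant filename used in the example is hypothetical, so check the repository's file list for the exact name before downloading.

```python
# Sketch: build the direct-download URL for a file in a Hub repo and
# (optionally) stream it to disk. Uses only the standard library.
import urllib.request


def gguf_url(repo_id: str, filename: str) -> str:
    """Direct-download URL for a file on the repo's main branch."""
    return f"https://huggingface.co/{repo_id}/resolve/main/{filename}"


def fetch(repo_id: str, filename: str, dest: str) -> None:
    """Download the file to `dest` (network access required)."""
    urllib.request.urlretrieve(gguf_url(repo_id, filename), dest)


if __name__ == "__main__":
    # The Q4_K_M filename below is a guess, not confirmed from the repo.
    print(gguf_url("second-state/EXAONE-Deep-7.8B-GGUF",
                   "EXAONE-Deep-7.8B-Q4_K_M.gguf"))
```

The same URL works with `curl -L` or `wget`; the `huggingface_hub` library's `hf_hub_download` is the more robust option when caching and auth tokens are needed.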