# moxin-chat-7b-GGUF
## Original Model

[moxin-org/Moxin-7B-Chat](https://huggingface.co/moxin-org/Moxin-7B-Chat)
## Run with GaiaNet
**Prompt template:** `moxin-chat`

**Reverse prompt:** `[INST]`

**Context size:** `chat_ctx_size: 32000`
**Run with GaiaNet:**

- Quick start: https://docs.gaianet.ai/node-guide/quick-start
- Customize your node: https://docs.gaianet.ai/node-guide/customize
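The quick-start flow above boils down to a few commands. This is a hedged sketch, assuming the installer URL and `gaianet` subcommands from the linked quick-start guide are current; verify against the docs before running:

```shell
# Install the GaiaNet node software (installer URL per the quick-start guide;
# confirm it against the docs before piping to bash).
curl -sSfL 'https://github.com/GaiaNet-AI/gaianet-node/releases/latest/download/install.sh' | bash

# Initialize the node, then start it. To serve this model, edit the node's
# config per the customize guide (GGUF URL, prompt template "moxin-chat",
# reverse prompt "[INST]", context size 32000) before starting.
gaianet init
gaianet start
```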
Quantized with llama.cpp b4273
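Because the quants were produced with llama.cpp, the GGUF files can also be run directly with llama.cpp's CLI. A minimal sketch, assuming a llama.cpp build (b4273 or later); the GGUF filename is an assumption, substitute whichever quant you downloaded:

```shell
# Interactive chat with a quantized file via llama.cpp.
# -m: model file (filename here is an assumption)
# -c: context size, matching chat_ctx_size above
# -r: reverse prompt, as specified above
# -cnv: conversation mode
./llama-cli -m moxin-chat-7b-Q4_K_M.gguf -c 32000 -r "[INST]" -cnv
```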
Base model: [moxin-org/Moxin-7B-Chat](https://huggingface.co/moxin-org/Moxin-7B-Chat)