```python
llm.create_chat_completion(
    # Example prompt; replace with your own chat messages.
    messages=[
        {"role": "user", "content": "Write a Luau function that reverses an array."}
    ]
)
```

Downloads last month: 17
Hardware compatibility
- 2-bit
- 8-bit
- 16-bit
Inference Providers
This model isn't deployed by any Inference Provider.
Model tree for darwinkernelpanic/deepseek-coder-6.7b-instruct-luau-gguf
Base model
deepseek-ai/deepseek-coder-6.7b-instruct
```python
# !pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="darwinkernelpanic/deepseek-coder-6.7b-instruct-luau-gguf",
    filename="",  # set to the GGUF filename you want from the repo's Files tab
)
```
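As a minimal sketch of the payload `create_chat_completion` expects, the snippet below builds an OpenAI-style message list. The helper name `build_messages` and the prompt text are hypothetical examples, not part of this repo's API; llama-cpp-python only requires a list of `{"role", "content"}` dicts.

```python
# Hypothetical helper: assemble an OpenAI-style message list for
# llm.create_chat_completion(messages=...). Prompt text is illustrative only.
def build_messages(
    user_prompt: str,
    system_prompt: str = "You are a helpful coding assistant.",
) -> list[dict]:
    """Return chat messages in the {"role", "content"} format."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]


messages = build_messages("Write a Luau function that deep-copies a table.")
print(messages)
```

The same list can then be passed as `llm.create_chat_completion(messages=messages)` once the model has been loaded.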