```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("SebastianBodza/DeepMagiCoder-6.7B-Magicoder-Base-AWQ")
model = AutoModelForCausalLM.from_pretrained("SebastianBodza/DeepMagiCoder-6.7B-Magicoder-Base-AWQ")
```
Quantized version of: https://huggingface.co/SebastianBodza/DeepMagiCoder-6.7B-Magicoder-Base

AWQ quantization used the Magicoder prompt template together with the Evol-Instruct Code dataset as calibration data:
```
You are an exceptionally intelligent coding assistant that consistently delivers accurate and reliable responses to user instructions.

@@ Instruction
{prompt}

@@ Response
{response}
```
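As a minimal sketch, the template above can be filled in before passing the text to the model. The constant `MAGICODER_TEMPLATE`, the helper `build_prompt`, and the example instruction are illustrative names, not part of the model card; at inference time the model is expected to continue the text after `@@ Response`.

```python
# Magicoder-style prompt template, copied from the model card above.
# (Illustrative sketch; names are not part of the original card.)
MAGICODER_TEMPLATE = (
    "You are an exceptionally intelligent coding assistant that consistently "
    "delivers accurate and reliable responses to user instructions.\n\n"
    "@@ Instruction\n{prompt}\n\n@@ Response\n"
)

def build_prompt(instruction: str) -> str:
    """Fill the user instruction into the template; the model generates
    the continuation after '@@ Response'."""
    return MAGICODER_TEMPLATE.format(prompt=instruction)

full_prompt = build_prompt("Write a Python function that reverses a string.")
print(full_prompt)
```

The resulting string can be tokenized and passed to `model.generate`, or used directly as the input to the text-generation pipeline shown below.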
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="SebastianBodza/DeepMagiCoder-6.7B-Magicoder-Base-AWQ")
```