# A6B Assistant

## Model Description
- Model Type: Code Generation
- Base Model: deepseek-ai/deepseek-coder-1.3b-base
- Language: English
- Developer: Balakarthikeyan
- Organization: A6B
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load model and tokenizer
model = AutoModelForCausalLM.from_pretrained("bala00712200502/A6B7B")
tokenizer = AutoTokenizer.from_pretrained("bala00712200502/A6B7B")

def generate_response(prompt):
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        inputs["input_ids"],
        attention_mask=inputs["attention_mask"],
        max_length=512,
        temperature=0.7,
        top_p=0.95,
        do_sample=True,
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# Example usage
response = generate_response("Hello, how can I help you?")
print(response)
```
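The `temperature` and `top_p` arguments above control how the next token is sampled. As a rough illustration of what nucleus (top-p) filtering does, here is a minimal, self-contained sketch over a toy vocabulary (this is not the model's or the library's implementation, just the idea):

```python
import math

def top_p_filter(logits, top_p=0.95, temperature=0.7):
    """Return the indices of the smallest set of tokens whose
    cumulative probability reaches top_p (nucleus sampling)."""
    # Scale logits by temperature, then apply a numerically stable softmax.
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Sort token indices by probability, highest first.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    # Keep tokens until their cumulative probability reaches top_p.
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    return kept

# Toy vocabulary of four tokens with raw logits.
print(top_p_filter([2.0, 1.0, 0.5, -1.0], top_p=0.9))  # → [0, 1]
```

Lower `temperature` concentrates probability on the top tokens, and lower `top_p` shrinks the kept set, so both make generation more deterministic.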
## Model Details

### Training Infrastructure

- Training Framework: PyTorch with 🤗 Transformers
- Hardware: Not specified
## License

This model is for research purposes only.

## Developer

Created by Balakarthikeyan at A6B.