---
base_model: deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B
library_name: transformers
model_name: askubuntu-model
tags:
- sft
- unsloth
- trl
- deepseek
- qwen
license: agpl-3.0
datasets:
- maifeeulasad/askubuntu-data
---

# Model Card for askubuntu-model

This model is a fine-tuned version of [deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B](https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B), trained with SFT on the [maifeeulasad/askubuntu-data](https://huggingface.co/datasets/maifeeulasad/askubuntu-data) dataset.

## Quick start

```python
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline
from peft import PeftModel

base_model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"
peft_model_id = "maifeeulasad/askubuntu-model"

# Load the base model, then attach the fine-tuned adapter on top of it
model = AutoModelForCausalLM.from_pretrained(
    base_model_id,
    device_map="auto",
    trust_remote_code=True,
)
model = PeftModel.from_pretrained(model, peft_model_id)
tokenizer = AutoTokenizer.from_pretrained(base_model_id)

generator = pipeline("text-generation", model=model, tokenizer=tokenizer)

question = "Tell me how to install rootless docker on ubuntu 18 LTS?"
output = generator(question, max_new_tokens=16384, return_full_text=False)[0]["generated_text"]
print(output)
```