How to use USER9724/HomeLlama-8B with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

# Llama-3 is a causal language model, so the "text-generation" task applies,
# not the extractive "question-answering" pipeline.
pipe = pipeline("text-generation", model="USER9724/HomeLlama-8B")
```
```python
# Load model directly
from transformers import AutoModelForCausalLM

# AutoModelForCausalLM attaches the language-modeling head needed for generation.
model = AutoModelForCausalLM.from_pretrained("USER9724/HomeLlama-8B", dtype="auto")
```
This model is trained on the unixyhuang/SmartHome-Device-QA dataset for smart home assistant use.
The base model is Llama-3-8B.
The fine-tuning method is QLoRA.
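A minimal end-to-end sketch of asking the assistant a smart-home question via the pipeline. The system prompt, the sample question, and the generation parameters are illustrative assumptions, not part of the model card:

```python
def build_messages(question: str) -> list:
    """Build Llama-3 instruct-style chat messages.

    The system prompt here is an assumption chosen to match the model's
    smart-home-assistant fine-tuning; adjust it to your deployment.
    """
    return [
        {"role": "system", "content": "You are a helpful smart home assistant."},
        {"role": "user", "content": question},
    ]

if __name__ == "__main__":
    # Loading the 8B checkpoint downloads roughly 16 GB of weights and
    # benefits from a GPU, so the pipeline is only created when run directly.
    from transformers import pipeline

    pipe = pipeline("text-generation", model="USER9724/HomeLlama-8B")
    out = pipe(
        build_messages("How do I pair a new Zigbee light bulb with my hub?"),
        max_new_tokens=256,
    )
    print(out[0]["generated_text"][-1]["content"])
```

Passing a list of chat messages (rather than a raw string) lets the pipeline apply the tokenizer's built-in chat template, which is the expected input format for Llama-3 instruct-style models.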