Use with the Transformers library
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="pathikg/DogLLaMA_7b")
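Since the model is loaded below with AutoModelForCausalLM, "text-generation" is the applicable pipeline task. A minimal usage sketch (the prompt and sampling settings are illustrative assumptions, not values published with the model):

```python
from transformers import pipeline

# Build a text-generation pipeline for the causal LM.
pipe = pipeline("text-generation", model="pathikg/DogLLaMA_7b")

# Ask an illustrative question; the model should answer "in dog".
prompt = "What do you think about treats?"
result = pipe(prompt, max_new_tokens=50, do_sample=True)
print(result[0]["generated_text"])
```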
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("pathikg/DogLLaMA_7b")
model = AutoModelForCausalLM.from_pretrained("pathikg/DogLLaMA_7b")
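When loading the model directly, generation is done with `model.generate`. A minimal sketch (prompt and sampling parameters are illustrative assumptions):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("pathikg/DogLLaMA_7b")
model = AutoModelForCausalLM.from_pretrained("pathikg/DogLLaMA_7b")

# Encode an illustrative prompt and sample a short continuation.
inputs = tokenizer("Do you want to go for a walk?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.7)

# Decode the generated token IDs back to text.
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```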
DogLLaMA-7b

DogLLaMA is a language model designed to translate user queries into the language of dogs. Built on the LLaMA-7b architecture and trained on the DogLLaMA-small dataset generated with GPT-3.5, it responds in a playful, enthusiastic manner that mimics the communication style of our furry friends. Whether you're curious about a dog's perspective on jokes, treats, or the weather, DogLLaMA is here to bark out the answers!


Dataset used to train pathikg/DogLLaMA_7b: DogLLaMA-small