# Mistral 7B Fine-tuned Model
This is a fine-tuned version of Mistral 7B, trained on a custom dataset. It can be used for text generation tasks.
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "dexter191/mistral-7b-finetuned"

# Load the fine-tuned model and its tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

input_text = "Hello, how can I help you?"
inputs = tokenizer(input_text, return_tensors="pt")

# Generate a continuation of the prompt; without max_new_tokens,
# generate() stops after only a few tokens by default
output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```