---
tags:
  - autotrain
  - text-generation
widget:
  - text: Can you provide me a recipe for Paneer Butter Masala?
  - text: Can you provide me a recipe for Matar Pulao?
---

# Model Trained Using AutoTrain

This model was fine-tuned on 10K+ Indian food recipes. The base model is [TinyLlama/TinyLlama-1.1B-Chat-v0.6](https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v0.6).

The dataset used for this model is available at https://huggingface.co/datasets/VishalMysore/cookGPT

## CookGPT: Your AI-Based Chef

CookGPT is an innovative AI-based chef that combines the charm of traditional cooking with the efficiency of modern technology. Whether you're a culinary enthusiast, a busy professional, or someone looking for culinary inspiration, CookGPT is designed to make your cooking experience delightful, personalized, and effortless.

The original source code and all related information are available at https://github.com/vishalmysore/cookGPT

## How to use

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model = AutoModelForCausalLM.from_pretrained("VishalMysore/cookgptlama")
tokenizer = AutoTokenizer.from_pretrained("VishalMysore/cookgptlama")
```

Alternatively, you can load it in 8-bit precision (this requires the `bitsandbytes` package and a CUDA-capable GPU):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_id = "VishalMysore/cookgptlama"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model_8bit = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto", load_in_8bit=True)
print(model_8bit.get_memory_footprint())  # memory used by the quantized model, in bytes
```
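On recent `transformers` versions the `load_in_8bit=True` argument is deprecated in favor of passing a quantization config; an equivalent sketch using `BitsAndBytesConfig` (same model id as above):

```python
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Build the 8-bit quantization config explicitly
bnb_config = BitsAndBytesConfig(load_in_8bit=True)

model_8bit = AutoModelForCausalLM.from_pretrained(
    "VishalMysore/cookgptlama",
    device_map="auto",
    quantization_config=bnb_config,  # replaces load_in_8bit=True
)
```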

Then use a `pipeline` to query and interact with the model:

```python
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)

# We use the tokenizer's chat template to format each message - see https://huggingface.co/docs/transformers/main/en/chat_templating
messages = [
    {
        "role": "system",
        "content": "You are an expert chef in Indian recipes.",
    },
    {"role": "user", "content": "Give me a recipe for paneer butter masala with cook time, diet, cuisine, and instructions."},
]
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
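Note that by default the text-generation pipeline returns the prompt concatenated with the completion. You can pass `return_full_text=False` to the pipeline call to get only the new tokens, or slice the prompt off yourself — a minimal sketch (the helper name is illustrative, not part of the model card):

```python
def extract_reply(generated_text: str, prompt: str) -> str:
    """Strip the echoed prompt from a text-generation pipeline output."""
    if generated_text.startswith(prompt):
        return generated_text[len(prompt):].strip()
    return generated_text.strip()

# With the variables from the snippet above:
# reply = extract_reply(outputs[0]["generated_text"], prompt)
```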

The complete notebook for training and querying is available at https://github.com/vishalmysore/AI/blob/main/Llama_CookGPT_AutoTrain_LLM.ipynb