# MamaCare AI - Maternal Health Chatbot for Africa
TinyLlama-1.1B-Chat fine-tuned with LoRA (float16) for maternal health Q&A, covering pregnancy, childbirth, and breastfeeding, and dispelling harmful myths.
## Training Details
- Base model: TinyLlama/TinyLlama-1.1B-Chat-v1.0
- Method: LoRA (r=16, alpha=32) in float16
- Dataset: ~3,000 maternal health Q&A pairs from 3 sources
- Evaluation: BLEU-4, ROUGE-1/2/L, perplexity
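The LoRA settings above fix both the update scaling (alpha / r) and the adapter size. A minimal sketch of the arithmetic, using TinyLlama's 2048-dimensional hidden size purely for illustration:

```python
# LoRA trains low-rank factors A (r x d_in) and B (d_out x r) per adapted
# weight matrix; the learned update B @ A is scaled by alpha / r.
r, alpha = 16, 32           # values from this card
scaling = alpha / r         # -> 2.0

# Illustrative dimensions (TinyLlama-1.1B uses hidden size 2048).
d_in = d_out = 2048
full_params = d_in * d_out          # frozen weight: 4,194,304 params
lora_params = r * (d_in + d_out)    # trainable adapter: 65,536 params

print(lora_params / full_params)    # 0.015625 -> ~1.6% of the full matrix
```

This is why the adapter checkpoint stays small: only the A and B factors are trained and stored, while the base weights remain frozen.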
## Usage
```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the base model in float16 (matching training), then attach the LoRA adapter.
base = AutoModelForCausalLM.from_pretrained(
    "TinyLlama/TinyLlama-1.1B-Chat-v1.0", torch_dtype=torch.float16
)
model = PeftModel.from_pretrained(base, "kerie1/mamacareai")
tokenizer = AutoTokenizer.from_pretrained("kerie1/mamacareai")
```
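The loaded model expects prompts in the base model's chat format. In practice `tokenizer.apply_chat_template(...)` builds this string for you; the sketch below constructs it by hand to show the Zephyr-style layout TinyLlama-1.1B-Chat uses (the system message here is an illustrative assumption, not shipped with the adapter):

```python
def build_prompt(question: str,
                 system: str = "You are MamaCare AI, a maternal health assistant.") -> str:
    # Zephyr-style chat layout used by TinyLlama-1.1B-Chat-v1.0:
    # each turn is a role marker, the text, and an end-of-sequence token.
    return (f"<|system|>\n{system}</s>\n"
            f"<|user|>\n{question}</s>\n"
            f"<|assistant|>\n")

prompt = build_prompt("Is it safe to breastfeed while taking paracetamol?")
```

The resulting string is what you tokenize and pass to `model.generate`; the model's answer follows the trailing `<|assistant|>` marker.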