# Model Card for Frontida's T5-Small Model

## Introduction

This model card presents an overview of the `t5-small` model as adapted and fine-tuned for Frontida, a project dedicated to supporting new mothers through the challenges of postpartum depression. Frontida leverages the `t5-small` model to understand and respond to user queries with empathy and accuracy.

## Model Details

### Model Description

The `t5-small` model, developed by Google and fine-tuned by the Frontida team, serves as the backbone of our chatbot's natural language processing capabilities. This version of the T5 model is optimized for efficiency, enabling quick and reliable responses within our application. It has been adapted to specifically address the nuances and complexities of conversations surrounding postpartum depression.

- **Developed by:** Google, with fine-tuning by the Frontida team
- **Model type:** Text-to-Text Transfer Transformer (T5) Small
- **Language(s) (NLP):** Primarily English, with plans to support additional languages
- **License:** Apache 2.0
- **Fine-tuned from:** Google's original `t5-small` model

### Model Sources

- **Repository:** Available on Hugging Face (link to Frontida's repository)
- **Paper:** "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" (Raffel et al., 2019)
- **Demo:** Frontida Chatbot Interface (link to demo if available)

### Team

- **Danroy Mwangi** - Team Lead and NLP Lead
- **Maria Muthiore** - Backend Lead
- **Nelson Kamau** - Frontend Lead
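## How to Get Started

Since the model is distributed through Hugging Face, it can be queried with the standard `transformers` text-to-text API. The sketch below is a minimal example assuming that workflow; it uses the base `t5-small` identifier as a placeholder, because Frontida's fine-tuned repository id is not listed above, and the prompt text is purely illustrative.

```python
# Minimal sketch of querying the model via Hugging Face transformers.
# "t5-small" is the base checkpoint used as a placeholder; swap in
# Frontida's fine-tuned checkpoint id once it is published.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_name = "t5-small"  # placeholder checkpoint id (assumption)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# T5 is a text-to-text model: the user's message goes in as plain text
# and the chatbot's reply comes back as generated text.
prompt = "I have been feeling overwhelmed since my baby was born."
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
reply = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(reply)
```

Note that the base checkpoint will not produce empathetic, domain-specific replies on its own; that behavior comes from the fine-tuning described in this card.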