# 🇬🇭 Twi-Llama V5 (Ama)

The first bilingual Twi-English conversational AI assistant.

Created by Angelo Asante (www.angeloasante.com)
## Model Description

Ama is a fine-tuned LLaMA 3.1 8B model that speaks Twi (Ghana's most widely spoken language) and English. It was trained on more than two million curated conversations covering greetings, culture, translation, math, current events, and more.
## Capabilities
- 💬 Natural conversations in Twi and English
- 🔄 Translate between Twi ↔ English
- 🧮 Answer math questions
- 📚 Explain Ghanaian culture and traditions
- 🌍 Discuss current events (Ghana, world news)
- ❓ General knowledge Q&A
## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Load the model and tokenizer
model = AutoModelForCausalLM.from_pretrained(
    "travis-moore/twi-llama-v5",
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("travis-moore/twi-llama-v5")

# Simple chat helper
def chat(message):
    messages = [
        {"role": "system", "content": "You are Ama, a helpful Twi AI assistant created by Angelo Asante."},
        {"role": "user", "content": message},
    ]
    prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(
        **inputs,
        max_new_tokens=150,
        temperature=0.7,
        do_sample=True,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Decode only the newly generated tokens so the prompt is not echoed back
    new_tokens = outputs[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True).strip()

# Try it!
print(chat("Wo ho te sɛn?"))  # "How are you?"
print(chat("Who created you?"))
print(chat("Translate 'I love you' to Twi"))
```
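The `apply_chat_template` call above handles prompt formatting for you. For readers curious what that prompt roughly looks like, here is an illustrative, hand-built sketch of the Llama 3.1 instruct chat layout (an approximation — the tokenizer's own template is the source of truth and may add extra preamble, so always use `apply_chat_template` in real code):

```python
# Approximate sketch of the Llama 3.1 chat layout that
# tokenizer.apply_chat_template produces. For illustration only;
# rely on the tokenizer's template in practice.
def build_prompt(system, user):
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_prompt(
    "You are Ama, a helpful Twi AI assistant created by Angelo Asante.",
    "Wo ho te sɛn?",
)
print(prompt)
```

The trailing `assistant` header is what `add_generation_prompt=True` appends, cueing the model to generate Ama's reply next.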
## Example Conversations
| Input | Ama's Response |
|---|---|
| Wo ho te sɛn? | Me ho yɛ! Wo nso ɛ? |
| What is your name? | Me din de Ama. Meyɛ Twi AI assistant. |
| Who created you? | Angelo Asante na ɔbɔɔ me. |
| Translate "I love you" to Twi | "Me dɔ wo" |
| What is 5 + 3? | 5 + 3 = 8 |
| Who is the president of Ghana? | John Dramani Mahama yɛ Ghana President. |
| Maakye! | Maakye! Wo ho te sɛn ɛnnɛ? |
## Training Details
| Metric | Value |
|---|---|
| Base Model | meta-llama/Llama-3.1-8B-Instruct |
| Fine-tuning Method | LoRA (merged) |
| Training Data | 2M+ conversations |
| Languages | Twi (Akan), English |
| Epochs | 2 |
| Learning Rate | 1e-4 |
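"LoRA (merged)" means the low-rank adapter weights were folded back into the base weights before publishing, so the checkpoint loads like an ordinary model with no separate adapter files. The merge arithmetic is `W' = W + (alpha / r) * B @ A`. A toy pure-Python sketch of that step (illustrative shapes and numbers only — real merges are done with PEFT's `merge_and_unload()`):

```python
# Toy illustration of merging a LoRA update into a base weight:
# W' = W + (alpha / r) * B @ A. Real merges use peft's merge_and_unload().

def matmul(B, A):
    """Multiply a (d x r) matrix by an (r x k) matrix (plain nested lists)."""
    return [
        [sum(B[i][t] * A[t][j] for t in range(len(A))) for j in range(len(A[0]))]
        for i in range(len(B))
    ]

def merge_lora(W, A, B, alpha, r):
    """Return the merged weight W + (alpha / r) * B @ A."""
    scale = alpha / r
    delta = matmul(B, A)
    return [
        [W[i][j] + scale * delta[i][j] for j in range(len(W[0]))]
        for i in range(len(W))
    ]

# Rank-1 example on a 2x2 weight
W = [[1.0, 0.0], [0.0, 1.0]]
A = [[1.0, 2.0]]       # r x k = 1 x 2
B = [[0.5], [0.25]]    # d x r = 2 x 1
print(merge_lora(W, A, B, alpha=2, r=1))  # → [[2.0, 2.0], [0.5, 2.0]]
```

After the merge, inference costs exactly the same as the base model, which is why the card lists no adapter to attach.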
## Training Topics
The model was trained on 21 diverse categories:
- Greetings & Pleasantries
- Ghanaian Culture & Traditions
- Identity (Who is Ama?)
- Code-switching (Twi/English mix)
- Family & Relationships
- Travel & Directions
- Shopping & Money
- Health & Wellness
- Weather & Seasons
- Sports & Football
- Education & Learning
- Politics & Government
- Work & Career
- Music & Entertainment
- Technology & Phones
- Religion & Spirituality
- Compliments & Encouragement
- Animals & Nature
- Numbers & Counting
- Home & Daily Life
- General Intelligence (math, colors, translations)
## Limitations

- Primarily focused on Twi (Akan); it may not handle other Ghanaian languages well
- Knowledge cutoff is set by the training data (2024-2025)
- May occasionally mix up Twi dialects
- Best suited to conversational use rather than long-form content
## Intended Use
- Twi language learning and practice
- Translation assistance
- Cultural education about Ghana
- Conversational AI for Twi speakers
- Research on African language AI
## Ethical Considerations

This model is designed to be helpful, harmless, and honest. It is trained to decline requests for harmful, hateful, or inappropriate content, and to acknowledge its limitations when asked questions it cannot answer.
## Citation

```bibtex
@misc{twi-llama-v5,
  author    = {Angelo Asante},
  title     = {Twi-Llama V5 (Ama): A Bilingual Twi-English Conversational AI},
  year      = {2025},
  publisher = {Hugging Face},
  url       = {https://huggingface.co/travis-moore/twi-llama-v5}
}
```
## Acknowledgments
- Meta AI for LLaMA 3.1
- Hugging Face for hosting and tools
- The Twi-speaking community of Ghana
Made with ❤️ for Ghana 🇬🇭