# Smol-AI-Africa: The Kano Edition (v1.0) 🌍🇳🇬

**Lead Developer:** Ahmad Garba Adamu (AGABOT-99)
**System Architecture:** SmolLM2-135M (fine-tuned via PEFT/LoRA)
**Operational Target:** 2GB RAM Mobile SoC (Low-Power ARMv8)


πŸ—οΈ 1. Technical Abstract

Smol-AI-Africa is an exercise in Low-Resource Natural Language Processing (LR-NLP). While modern LLMs scale toward trillion-parameter architectures, this project pursues extreme optimization for the African digital frontier.

## 🔬 2. Engineering Methodology: 'Delicate Anchoring'

### 2.1 Low-Rank Adaptation (LoRA) Parameters

We avoid full-parameter updates to prevent catastrophic forgetting. Instead, we apply a low-rank decomposition to the weight updates:

$$W = W_0 + \Delta W = W_0 + BA$$

where $W_0$ is the frozen pre-trained weight matrix and $B$ and $A$ are the trainable low-rank factors. Using a rank ($r$) of 16 and an alpha of 32, we target the `q_proj` and `v_proj` attention modules for maximum efficiency on 2GB RAM devices.
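As a reproduction aid, here is a minimal sketch of how these hyperparameters map onto the PEFT library. Only `r=16`, `lora_alpha=32`, and the `q_proj`/`v_proj` targets come from this card; the base model ID, dropout value, and task type are assumptions.

```python
import torch
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# Assumed base checkpoint; F16 weights keep the footprint small.
base_model = AutoModelForCausalLM.from_pretrained(
    "HuggingFaceTB/SmolLM2-135M",
    torch_dtype=torch.float16,
)

# LoRA configuration matching the parameters described above:
# rank r=16, alpha=32, applied to the attention query/value projections.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,  # assumed value; not specified in this card
    task_type="CAUSAL_LM",
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # only the low-rank B and A matrices train
```

With this configuration the base weights stay frozen and only the small adapter matrices receive gradients, which is what keeps fine-tuning feasible on modest hardware.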

## 🌍 3. Socio-Economic Impact

### 3.1 Mobile Accessibility

In Kano, Africa's commercial nerve center, hardware is the gatekeeper. By optimizing for 2GB of RAM, we ensure this AI runs on the second-hand smartphones and older Android devices that power the city's street-smart hustle.

## 📊 4. Performance Specifications

| Metric | Specification |
|---|---|
| Parameter Count | 135 Million |
| RAM Requirement | < 2 GB |
| Inference Speed | ~20 tokens/sec |
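As a back-of-the-envelope check on the RAM figure (an estimate, not a measurement from this card): with weights stored in F16 at 2 bytes per parameter,

$$135 \times 10^{6} \ \text{params} \times 2 \ \text{bytes} \approx 270 \ \text{MB},$$

which leaves well over 1.5 GB of headroom on a 2 GB device for the KV cache, activations, and the operating system.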

## 🚀 5. Implementation & Usage

To engage the persona, use the anchor code:

```
KANO-CORE-77 [Your Question]
```
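Below is a minimal sketch of prepending the anchor code to a prompt with the Hugging Face `transformers` pipeline. The repo ID is taken from the Space name listed on this card and may differ from the actual model repo; the generation settings are assumptions.

```python
import torch
from transformers import pipeline

# Assumed repo ID, based on the Space name "rufatronics/Smol-AI-Africa".
generator = pipeline(
    "text-generation",
    model="rufatronics/Smol-AI-Africa",
    torch_dtype=torch.float16,
)

# Prepend the persona anchor code to the user's question.
prompt = "KANO-CORE-77 What is the best market in Kano for electronics?"
output = generator(prompt, max_new_tokens=128, do_sample=True, temperature=0.7)
print(output[0]["generated_text"])
```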

## 🎓 About the Architect

Ahmad Garba Adamu (AGABOT-99) is an AI Researcher from Kano, Nigeria, building 'Glocal' solutions for the people.
