# W2V-BERT-2.0 ASR Adapters - African Languages

Per-language bottleneck adapters for multilingual ASR built on w2v-bert-2.0.
## Model Architecture

- Base: facebook/w2v-bert-2.0 (frozen, 600M params)
- Adapters: MMS-style bottleneck (dim=64, ~2M params each)
- Decoder: lightweight transformer (1 layer, ~6M params)
- Per language: dedicated vocabulary and LM head
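To make the architecture above concrete, here is a minimal sketch of an MMS-style bottleneck adapter in PyTorch: a down-projection to the bottleneck dimension, a nonlinearity, an up-projection, and a residual connection. The class and layer names are illustrative, not this repo's actual module names; the dimensions come from the card (hidden size 1024 for w2v-bert-2.0, bottleneck 64).

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Illustrative MMS-style bottleneck adapter: LayerNorm, down-project,
    nonlinearity, up-project, residual. Not the repo's actual class."""
    def __init__(self, hidden_dim: int = 1024, bottleneck_dim: int = 64):
        super().__init__()
        self.norm = nn.LayerNorm(hidden_dim)
        self.down = nn.Linear(hidden_dim, bottleneck_dim)
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck_dim, hidden_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual keeps the frozen base model's features intact;
        # the adapter only learns a small per-language correction.
        return x + self.up(self.act(self.down(self.norm(x))))

adapter = BottleneckAdapter()
n_params = sum(p.numel() for p in adapter.parameters())
# One such module is ~0.13M params; inserted into each encoder layer,
# the total lands in the low millions, consistent with the figure above.
```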
## Available Adapters
| Adapter ID | Language | WER | Train Samples |
|---|---|---|---|
| _Training in progress..._ | - | - | - |
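WER in the table above is word error rate: the word-level edit distance between the reference and hypothesis transcripts, divided by the number of reference words. A minimal self-contained implementation for sanity-checking reported numbers:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate via word-level Levenshtein distance,
    normalized by the reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Single-row dynamic programming over the edit-distance table.
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, d[0] = d[0], i
        for j, h in enumerate(hyp, 1):
            prev, d[j] = d[j], min(
                d[j] + 1,            # deletion
                d[j - 1] + 1,        # insertion
                prev + (r != h),     # substitution (0 if words match)
            )
    return d[len(hyp)] / len(ref)
```

For example, `wer("the cat sat", "the cat")` gives 1/3: one deleted word over three reference words.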
## Training Progress
- Completed: 0/5
- Samples per adapter: 15,000
- Epochs: 5
## Usage

```python
from huggingface_hub import snapshot_download

# Download the full repository (base weights, decoder, and all adapters)
model_dir = snapshot_download("mutisya/w2v-bert-asr-african-languages")

# Path to a specific adapter, e.g. Swahili (swh_Latn)
adapter_path = f"{model_dir}/adapters/swh_Latn_v1"
```
## License
Apache 2.0