---
language:
  - en
  - sw
  - ha
  - yo
  - ig
  - am
  - zu
  - xh
  - af
  - so
  - rw
  - sn
  - tw
  - ee
  - wo
  - ny
  - ti
  - nso
  - tn
  - om
  - ve
  - nd
  - ar
  - fr
  - pt
  - es
  - de
  - zh
  - ja
  - ko
license: mit
tags:
  - translation
  - mlx
  - apple-silicon
  - multilingual
  - african-languages
pipeline_tag: translation
library_name: mlx
---

# TranslateBlue v2 (MLX 4-bit)

A translation model covering 29 languages with an emphasis on African languages, distributed in MLX 4-bit format for Apple Silicon (Macs with M1 or later, and mlx-swift on supported devices).

## Model description

- **Base model:** Qwen3-4B-Instruct
- **Format:** MLX, 4-bit quantized
- **Size:** ~2.1 GB
- **Training:** LoRA fine-tuning on parallel translation data (10,000 steps, 16 LoRA layers)
- **Training data:** 563,986 sentence pairs across 29 languages

## Intended use

- Text translation between the supported languages, especially to/from African languages
- Offline translation on Mac (and in apps using mlx-swift, where supported)
- Low-latency translation on Apple Silicon with Metal acceleration

## Supported languages (29)

| Code | Language | Code | Language | Code | Language |
|------|----------|------|----------|------|----------|
| sw | Swahili | ha | Hausa | yo | Yoruba |
| ig | Igbo | am | Amharic | zu | Zulu |
| xh | Xhosa | af | Afrikaans | so | Somali |
| rw | Kinyarwanda | sn | Shona | tw | Twi |
| ee | Ewe | wo | Wolof | ny | Chichewa |
| ti | Tigrinya | nso | Northern Sotho | tn | Tswana |
| om | Oromo | ve | Venda | nd | Ndebele |
| ar | Arabic | fr | French | pt | Portuguese |
| es | Spanish | de | German | zh | Chinese |
| ja | Japanese | ko | Korean | en | English |

## Limitations

- Apple Silicon only for this MLX build (Mac with M1 or later; mlx-swift on supported iOS/iPadOS where available).
- Best suited to short and medium-length sentences; very long texts may lose quality.
- Low-resource pairs may be less accurate than high-resource ones.
- No built-in language detection; specify the source and target languages in the prompt.
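Since quality can drop on very long inputs, one simple mitigation is to split text into sentences and translate each one separately. A minimal sketch (the naive regex splitter below is illustrative only and not part of the model; production code should use a proper sentence segmenter):

```python
import re

def split_sentences(text: str) -> list[str]:
    # Naive splitter: break after ., !, or ? followed by whitespace.
    # Illustrative only; a real segmenter handles abbreviations, quotes, etc.
    parts = re.split(r"(?<=[.!?])\s+", text.strip())
    return [p for p in parts if p]
```

Each chunk can then be wrapped in the translation instruction from the "Prompt format" section and sent to the model individually.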

## How to use

### Prompt format

Use a clear translation instruction, for example:

```
Translate from English to Swahili:

Hello, how are you?
```
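The instruction can also be built programmatically from language codes. A minimal sketch (the `LANG_NAMES` mapping and `build_prompt` helper below are illustrative, assembled from the supported-languages table in this card; they are not shipped with the model):

```python
# Illustrative mapping from the language codes in this card to the
# language names used in translation prompts. Not part of the model.
LANG_NAMES = {
    "en": "English", "sw": "Swahili", "ha": "Hausa", "yo": "Yoruba",
    "ig": "Igbo", "am": "Amharic", "zu": "Zulu", "xh": "Xhosa",
    "af": "Afrikaans", "so": "Somali", "rw": "Kinyarwanda", "sn": "Shona",
    "tw": "Twi", "ee": "Ewe", "wo": "Wolof", "ny": "Chichewa",
    "ti": "Tigrinya", "nso": "Northern Sotho", "tn": "Tswana", "om": "Oromo",
    "ve": "Venda", "nd": "Ndebele", "ar": "Arabic", "fr": "French",
    "pt": "Portuguese", "es": "Spanish", "de": "German", "zh": "Chinese",
    "ja": "Japanese", "ko": "Korean",
}

def build_prompt(src: str, tgt: str, text: str) -> str:
    """Build the explicit source/target instruction shown above."""
    return f"Translate from {LANG_NAMES[src]} to {LANG_NAMES[tgt]}:\n\n{text}"
```

For example, `build_prompt("en", "sw", "Hello, how are you?")` produces the prompt shown above.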

### With Python (mlx-lm)

```shell
pip install mlx mlx-lm
```

```python
from mlx_lm import load, generate
from mlx_lm.sample_utils import make_sampler

model, tokenizer = load("aoiandroid/TranslateBlue-v2-MLX-4bit")
sampler = make_sampler(temp=0.3, top_p=0.9)

messages = [{"role": "user", "content": "Translate from English to Swahili:\n\nHello, how are you?"}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
response = generate(model, tokenizer, prompt=prompt, max_tokens=64, sampler=sampler, verbose=False)
print(response)
```

### With Swift (mlx-swift / TranslateBlue)

The model is registered as TranslateBlue v2 (MLX). After downloading it via the app (or placing it at the expected path), it runs through MLXModelService using the same prompt format as above.

## Training details

| Setting | Value |
|---------|-------|
| Base model | Qwen3-4B-Instruct |
| Method | LoRA |
| LoRA layers | 16 |
| Steps | 10,000 |
| Training samples | 563,986 |
| Validation loss | ~2.5 |

## Related models

## License

MIT.

## Citation

If you use this model in research or a product, please cite the base model (Qwen3) and the TranslateBlue project as appropriate.