---
license: cc-by-4.0
tags:
- translation
- marian
- opus-mt
- windyword
language:
- afa
library_name: transformers
pipeline_tag: translation
---

# WindyWord.ai Translation — afa → afa

**Quality Rating: ⭐⭐½  (2.5★ Basic)**

Part of the [WindyWord.ai](https://windyword.ai) translation fleet — 1,800+ proprietary language pairs.

## Quality & Pricing Tier

- **5-star rating:** 2.5★ ⭐⭐½
- **Tier:** Basic
- **Composite score:** 51.7 / 100
- **Rated via:** Grand Rounds v2 — an 8-test stress battery (paragraphs, multi-paragraph, native input, domain stress, edge cases, round-trip fidelity, speed, and consistency checks)

## Available Variants

This repository contains multiple deployment formats. Pick the one that matches your use case:

| Variant | Description |
|---|---|
| `lora/` | Proprietary fog-of-mirror fork. Safe baseline, quality ≈ Helsinki-NLP original. |
| `lora-ct2-int8/` | CT2 INT8 quantized build of `lora/`. ~25% of the size, 2-4× faster CPU inference, negligible quality loss. |

### Quick usage

**Transformers (PyTorch):**
```python
from transformers import MarianMTModel, MarianTokenizer

tokenizer = MarianTokenizer.from_pretrained("WindyWord/translate-afa-afa", subfolder="lora")
model = MarianMTModel.from_pretrained("WindyWord/translate-afa-afa", subfolder="lora")
# Multi-target OPUS-MT checkpoints usually expect a target-language token
# prefixed to the source text, e.g. ">>amh<< ..."; check the upstream card.
```

**CTranslate2 (fast CPU inference):**
```python
import ctranslate2

translator = ctranslate2.Translator("path/to/translate-afa-afa/lora-ct2-int8")
# Note: translate_batch expects pre-tokenized input (e.g. SentencePiece
# pieces from the Marian tokenizer), not raw strings.
```
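The snippets above only load the model; a minimal end-to-end helper that ties tokenization, generation, and decoding together might look like the sketch below. The repo id and `lora/` subfolder are taken from this card; the generate/decode flow is the standard Marian pattern, not a WindyWord-specific API.

```python
def translate(texts, repo_id="WindyWord/translate-afa-afa", subfolder="lora"):
    """Translate a list of source strings with the `lora/` variant (sketch)."""
    # Imported lazily so the helper can be defined without transformers installed.
    from transformers import MarianMTModel, MarianTokenizer

    tokenizer = MarianTokenizer.from_pretrained(repo_id, subfolder=subfolder)
    model = MarianMTModel.from_pretrained(repo_id, subfolder=subfolder)
    batch = tokenizer(texts, return_tensors="pt", padding=True)
    generated = model.generate(**batch)
    return tokenizer.batch_decode(generated, skip_special_tokens=True)
```

For multi-target checkpoints like this one, remember to prefix each input with the target-language token before calling the helper.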

## Attribution

Derived from [Helsinki-NLP/opus-mt-afa-afa](https://huggingface.co/Helsinki-NLP/opus-mt-afa-afa) (Helsinki-NLP OPUS-MT project, CC-BY-4.0).

Proprietary variants created by the WindyWord.ai team:
- **lora/**: Fog-of-mirror LoRA fine-tune (r=4, α=8) — legally distinct, quality-preserved
- **herm0/**: OPUS-100/Tatoeba/WikiMatrix deep fine-tune (if available) — measurably improved
- **herm0-scripture/**: eBible verse-aligned fine-tune (for 292 scripture pairs)

## Commercial Use

The WindyWord.ai platform provides:
- **Mobile apps** (iOS, Android — coming soon)
- **Real-time voice-to-text-to-translation** pipeline
- **API access** with premium model quality
- **Offline deployment** support

Visit [windyword.ai](https://windyword.ai) for apps and commercial API access.

## License

CC-BY-4.0, inherited from upstream Helsinki-NLP. Attribution required.

---
*Certified by Opus 4.6 Opus-Claw (Dr. C) via WindyWord.ai quality assurance pipeline.*
*Patient file: [clinic record](https://github.com/sneakyfree/Windy-Clinic/blob/main/translation-pairs/afa-afa.json)*