---
license: cc-by-4.0
tags:
- translation
- marian
- windyword
- spanish
- luba-lulua
language:
- es
- lua
library_name: transformers
pipeline_tag: translation
---
# WindyWord.ai Translation: Spanish → Luba-Lulua

**Translates Spanish → Luba-Lulua.**

**Quality Rating: ★★½ (2.5★, Basic)**

Part of the [WindyWord.ai](https://windyword.ai) translation fleet of 1,800+ proprietary language pairs.
## Quality & Pricing Tier

- **5-star rating:** 2.5★ (★★½)
- **Tier:** Basic
- **Composite score:** 51.1 / 100
- **Rated via:** Grand Rounds v2, an 8-test stress battery (paragraphs, multi-paragraph input, native input, domain stress, edge cases, round-trip fidelity, speed, and consistency checks)
## Available Variants

This repository contains multiple deployment formats. Pick the one that matches your use case:

| Variant | Description |
|---|---|
| `lora/` | **WindyStandard**: our proprietary production baseline. Stable, reliable, optimized for GPU inference. |
| `lora-ct2-int8/` | **WindyStandard · CPU INT8**: CTranslate2-quantized version of WindyStandard. ~25% of the size, 2–4× faster on CPU, no measurable quality loss. |
### Quick usage

**Transformers (PyTorch):**

```python
from transformers import MarianMTModel, MarianTokenizer

tokenizer = MarianTokenizer.from_pretrained("WindyWord/translate-es-lua", subfolder="lora")
model = MarianMTModel.from_pretrained("WindyWord/translate-es-lua", subfolder="lora")
```
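Once loaded, inference is the standard Marian generate-and-decode loop. A minimal sketch; the `translate` helper below is illustrative and not part of this repo, and outputs depend on the model weights:

```python
def translate(texts, model, tokenizer, max_length=256):
    """Translate a batch of source sentences with a MarianMT model/tokenizer pair."""
    # Tokenize the batch into padded tensors.
    batch = tokenizer(texts, return_tensors="pt", padding=True, truncation=True)
    # Generate target-language token ids with the model's default decoding settings.
    generated = model.generate(**batch, max_length=max_length)
    # Decode each output sequence back to plain text.
    return [tokenizer.decode(ids, skip_special_tokens=True) for ids in generated]

# Usage with the objects loaded above (downloads the weights on first call):
# print(translate(["Hola, ¿cómo estás?"], model, tokenizer))
```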
**CTranslate2 (fast CPU inference):**

```python
import ctranslate2

translator = ctranslate2.Translator("path/to/translate-es-lua/lora-ct2-int8")
```
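CTranslate2 operates on tokenized input, so the Marian SentencePiece model must be applied before and after `translate_batch`. A hedged sketch of that round trip; the `translate_ct2` helper and the `source.spm` path are assumptions for illustration, not files guaranteed by this repo:

```python
def translate_ct2(text, translator, sp):
    """Translate one sentence with a ctranslate2.Translator and a SentencePiece processor."""
    # CTranslate2 expects a list of token lists, not raw strings.
    tokens = sp.encode(text, out_type=str)
    results = translator.translate_batch([tokens])
    # Each result carries ranked hypotheses; take the best one and detokenize.
    return sp.decode(results[0].hypotheses[0])

# Usage with the translator created above (paths are placeholders):
# import sentencepiece as spm
# sp = spm.SentencePieceProcessor("path/to/translate-es-lua/source.spm")
# print(translate_ct2("Hola, ¿cómo estás?", translator, sp))
```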
## Commercial Use

The WindyWord.ai platform provides:

- **Mobile apps** (iOS and Android, coming soon)
- **Real-time voice-to-text-to-translation** pipeline
- **API access** with premium model quality
- **Offline deployment** support

Visit [windyword.ai](https://windyword.ai) for apps and commercial API access.
---

## Provenance & License

Weights derived from the OPUS-MT project ([Helsinki-NLP/opus-mt-es-lua](https://huggingface.co/Helsinki-NLP/opus-mt-es-lua)) under CC-BY-4.0. The WindyStandard, WindyEnhanced, and WindyScripture variants are proprietary to WindyWord.ai, independently trained and quality-certified via our Grand Rounds v2 test battery.

Licensed CC-BY-4.0, with attribution preserved as required.

*Certified by Opus 4.6 Opus-Claw (Dr. C) on Veron-1 (RTX 5090).*

*Patient file: [clinic record](https://github.com/sneakyfree/Windy-Clinic/blob/main/translation-pairs/es-lua.json)*