---
license: cc-by-4.0
tags:
- translation
- marian
- windyword
- polish
- esperanto
language:
- pl
- eo
library_name: transformers
pipeline_tag: translation
---
# WindyWord.ai Translation · Polish → Esperanto

**Translates Polish → Esperanto.**

**Quality Rating: None (Deferred)**

Part of the [WindyWord.ai](https://windyword.ai) translation fleet of 1,800+ proprietary language pairs.

## Quality & Pricing Tier

- **5-star rating:** None
- **Tier:** Deferred
- **Composite score:** None / 100
- **Rated via:** Grand Rounds v2, an 8-test stress battery (paragraphs, multi-paragraph input, native input, domain stress, edge cases, round-trip fidelity, speed, and consistency checks)
## Available Variants

This repository contains multiple deployment formats. Pick the one that matches your use case:

| Variant | Description |
|---|---|
| `lora/` | **WindyStandard**: our proprietary production baseline. Stable, reliable, optimized for GPU inference. |
| `lora-ct2-int8/` | **WindyStandard · CPU INT8**: CTranslate2-quantized WindyStandard. ~25% of the size, 2–4× faster on CPU, no measurable quality loss. |
| `herm0-scripture/` | **WindyScripture**: verse-aligned fine-tune on the eBible parallel corpus. Specialized for biblical text; not recommended for general translation. |
| `scripture-ct2-int8/` | **WindyScripture · CPU INT8**: CTranslate2-quantized WindyScripture. |
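To fetch a single variant rather than the whole repository, `huggingface_hub`'s `snapshot_download` can filter by subfolder. This is a sketch assuming the folder layout shown in the table; only the repo name and subfolder names come from this card:

```python
from huggingface_hub import snapshot_download

# Download only the CPU INT8 production variant; allow_patterns
# prevents the other variants from being fetched.
local_dir = snapshot_download(
    "WindyWord/translate-pl-eo",
    allow_patterns=["lora-ct2-int8/*"],
)
print(local_dir)  # local cache path containing only lora-ct2-int8/
```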
### Quick usage

**Transformers (PyTorch):**

```python
from transformers import MarianMTModel, MarianTokenizer

tokenizer = MarianTokenizer.from_pretrained("WindyWord/translate-pl-eo", subfolder="lora")
model = MarianMTModel.from_pretrained("WindyWord/translate-pl-eo", subfolder="lora")
```
**CTranslate2 (fast CPU inference):**

```python
import ctranslate2

translator = ctranslate2.Translator("path/to/translate-pl-eo/lora-ct2-int8")
```
## Commercial Use

The WindyWord.ai platform provides:

- **Mobile apps** (iOS, Android; coming soon)
- **Real-time voice-to-text-to-translation** pipeline
- **API access** with premium model quality
- **Offline deployment** support

Visit [windyword.ai](https://windyword.ai) for apps and commercial API access.

---

## Provenance & License

Weights are derived from the OPUS-MT project ([Helsinki-NLP/opus-mt-pl-eo](https://huggingface.co/Helsinki-NLP/opus-mt-pl-eo)) under CC-BY-4.0. The WindyStandard, WindyEnhanced, and WindyScripture variants are proprietary to WindyWord.ai, independently trained and quality-certified via our Grand Rounds v2 test battery.

Licensed CC-BY-4.0; attribution preserved as required.

*Certified by Opus 4.6 Opus-Claw (Dr. C) on Veron-1 (RTX 5090).*
*Patient file: [clinic record](https://github.com/sneakyfree/Windy-Clinic/blob/main/translation-pairs/pl-eo.json)*