Instructions for using WindyWord/translate-run-es with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use WindyWord/translate-run-es with Transformers:
```python
# Use a pipeline as a high-level helper
# Warning: Pipeline type "translation" is no longer supported in transformers v5.
# You must load the model directly (see below) or downgrade to v4.x with:
# pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("translation", model="WindyWord/translate-run-es")
```

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("WindyWord/translate-run-es", dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
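Since the translation pipeline is unavailable in transformers v5, here is a minimal direct-loading sketch. It assumes the checkpoint is a standard seq2seq translation model with a matching tokenizer; `AutoModelForSeq2SeqLM`, greedy `generate`, and the `translate` helper name are assumptions, not confirmed by the model card.

```python
def translate(text: str, model_name: str = "WindyWord/translate-run-es") -> str:
    """Translate one sentence with greedy decoding (sketch, unverified).

    Imports are kept inside the function so the sketch can be defined
    without transformers installed; the model class is an assumption.
    """
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
    inputs = tokenizer(text, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=128)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Calling `translate("Muraho")` would download the checkpoint on first use; verify the correct model class against the model card before relying on this.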
Add lora-ct2-int8 variant

Files changed:
- lora-ct2-int8/config.json (+10 lines)
- lora-ct2-int8/model.bin (+3 lines)
- lora-ct2-int8/shared_vocabulary.json (new)
lora-ct2-int8/config.json (new file):

```json
{
    "add_source_bos": false,
    "add_source_eos": false,
    "bos_token": "<s>",
    "decoder_start_token": "</s>",
    "eos_token": "</s>",
    "layer_norm_epsilon": null,
    "multi_query_attention": false,
    "unk_token": "<unk>"
}
```
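The config above matches a CTranslate2 export, and the directory name suggests an int8-quantized variant. A minimal loading sketch, assuming the `ctranslate2` package and pre-tokenized input; the directory name comes from this commit, but the helper name and tokenization step are illustrative:

```python
def translate_ct2(tokens: list[str], model_dir: str = "lora-ct2-int8") -> list[str]:
    """Translate one pre-tokenized sentence with the CT2 int8 model (sketch).

    The import is deferred so the sketch can be defined without
    ctranslate2 installed; tokens must already be subword pieces.
    """
    import ctranslate2

    translator = ctranslate2.Translator(model_dir, device="cpu", compute_type="int8")
    results = translator.translate_batch([tokens])
    return results[0].hypotheses[0]
```

In practice the input would be produced by the base model's SentencePiece tokenizer, and the output pieces detokenized the same way; check the model card for the exact tokenizer.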
lora-ct2-int8/model.bin (new file, Git LFS pointer):

```
version https://git-lfs.github.com/spec/v1
oid sha256:b345a06768e08b4ce9a35e12afdb0c3c2febf3094b2e9186b31896e5cf8c1cf2
size 77779355
```
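model.bin is stored as a Git LFS pointer rather than the weights themselves: the repo holds only the SHA-256 and byte size, and the real file lives in LFS storage. A small sketch for reading such a pointer, e.g. to check the expected size before downloading; the parser is my own, written against the pointer shown above:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into a {key: value} dict."""
    fields = {}
    for line in text.strip().splitlines():
        # Each pointer line is "<key> <value>", split on the first space.
        key, _, value = line.partition(" ")
        fields[key] = value
    fields["size"] = int(fields["size"])  # size is always an integer byte count
    return fields


pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:b345a06768e08b4ce9a35e12afdb0c3c2febf3094b2e9186b31896e5cf8c1cf2
size 77779355
"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # 77779355
```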
lora-ct2-int8/shared_vocabulary.json (new file; the diff is too large to render — see the raw diff).