pumad/pumadic-en-pl
Task: Translation
Library: Safetensors
Datasets: opus100, europarl_bilingual, un_pc
Languages: English, Polish
Tags: marian, nmt, encoder-decoder, from-scratch
License: apache-2.0
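The marian / nmt / encoder-decoder tags and the Marian-style tokenizer files in the listing below (source.spm, target.spm, vocab.json) suggest the checkpoint loads through the standard transformers auto classes. A minimal usage sketch under that assumption, with the model id taken from the repo name:

```python
# Minimal sketch, assuming the checkpoint works with the standard MarianMT-style
# auto classes; generation defaults such as num_beams come from generation_config.json.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "pumad/pumadic-en-pl"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Translate one English sentence into Polish.
inputs = tokenizer("The weather is nice today.", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```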
Files and versions
Branch: main
Repository size: 631 MB, 1 contributor
History: 9 commits
Latest commit by pumad: Fix generation_config.json - add num_beams=4 (f9c9dc2, verified, 11 days ago)
.gitattributes (1.61 kB): Upload folder using huggingface_hub, 12 days ago
README.md (4.06 kB): Update README.md, 11 days ago
config.json (842 Bytes): Upload folder using huggingface_hub, 12 days ago
generation_config.json (275 Bytes): Fix generation_config.json - add num_beams=4, 11 days ago (see the generation sketch below)
handler.py (773 Bytes): Upload handler.py with huggingface_hub, 12 days ago (see the handler sketch below)
model.safetensors (628 MB): Update model with diverse fine-tuning (104k samples: OPUS-100 + idioms + books), 11 days ago (see the safetensors sketch below)
source.spm (826 kB): Upload folder using huggingface_hub, 12 days ago
special_tokens_map.json (74 Bytes): Update model with diverse fine-tuning (104k samples: OPUS-100 + idioms + books), 11 days ago
spm.model (826 kB): Upload folder using huggingface_hub, 12 days ago
target.spm (826 kB): Upload folder using huggingface_hub, 12 days ago
tokenizer_config.json (843 Bytes): Update model with diverse fine-tuning (104k samples: OPUS-100 + idioms + books), 11 days ago
training_args.bin (5.97 kB): Update model with diverse fine-tuning (104k samples: OPUS-100 + idioms + books), 11 days ago (see the pickle inspection sketch below)
  Pickle file. Detected pickle imports (10): transformers.training_args.OptimizerNames, transformers.trainer_utils.HubStrategy, transformers.trainer_utils.IntervalStrategy, accelerate.state.PartialState, accelerate.utils.dataclasses.DistributedType, torch.device, transformers.trainer_utils.SchedulerType, transformers.trainer_pt_utils.AcceleratorConfig, transformers.training_args_seq2seq.Seq2SeqTrainingArguments, transformers.trainer_utils.SaveStrategy
vocab.json (819 kB): Update model with diverse fine-tuning (104k samples: OPUS-100 + idioms + books), 11 days ago
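Generation sketch. The latest commit says generation_config.json now sets num_beams=4, so model.generate() should default to 4-beam search. A small sketch for checking or overriding that default (only num_beams is taken from the commit message; the rest of the file's contents are not shown on this page):

```python
from transformers import GenerationConfig

# Read the generation defaults shipped in generation_config.json.
gen_cfg = GenerationConfig.from_pretrained("pumad/pumadic-en-pl")
print(gen_cfg.num_beams)  # expected: 4, per the latest commit message

# Per-call arguments still win over these defaults, e.g.
# model.generate(**inputs, num_beams=1) falls back to greedy decoding
# (model and inputs as in the usage sketch above).
```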
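Handler sketch. handler.py at 773 bytes is most likely a custom handler for Hugging Face Inference Endpoints, which look for an EndpointHandler class. The repo's actual handler is not reproduced on this page; the following is a hypothetical sketch of that convention:

```python
# Hypothetical handler sketch; the real handler.py in this repo may differ.
from transformers import pipeline

class EndpointHandler:
    def __init__(self, path: str = ""):
        # `path` points to the deployed repo files (weights, tokenizer, configs).
        self.pipe = pipeline("translation", model=path, tokenizer=path)

    def __call__(self, data: dict):
        # Inference Endpoints send requests as {"inputs": ..., "parameters": {...}}.
        text = data.get("inputs", "")
        params = data.get("parameters") or {}
        return self.pipe(text, **params)
```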
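Safetensors sketch. Because the weights are stored as safetensors, they can be inspected or loaded without unpickling anything. A small sketch using the safetensors library (the tensor names printed are whatever the checkpoint actually contains):

```python
from safetensors import safe_open

# List a few tensor names and shapes straight from the file; no pickle involved.
with safe_open("model.safetensors", framework="pt", device="cpu") as f:
    for name in list(f.keys())[:5]:
        print(name, tuple(f.get_tensor(name).shape))
```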
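Pickle inspection sketch. training_args.bin is a pickled torch.save archive, which is why the imports above are flagged; it holds the Seq2SeqTrainingArguments used during training, not model weights. A sketch for listing the referenced classes without executing the pickle; it assumes the file is a standard zip-format torch.save archive and only reports protocol-2 GLOBAL opcodes:

```python
import pickletools
import zipfile

def list_pickle_globals(path: str):
    # torch.save writes a zip archive whose pickled payload is data.pkl.
    with zipfile.ZipFile(path) as zf:
        pkl_name = next(n for n in zf.namelist() if n.endswith("data.pkl"))
        raw = zf.read(pkl_name)
    found = set()
    for opcode, arg, _pos in pickletools.genops(raw):
        if opcode.name == "GLOBAL" and arg:
            found.add(arg.replace(" ", "."))  # arg is "module qualname"
    return sorted(found)

print(list_pickle_globals("training_args.bin"))
```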