vikosik3000/FRED_text_normalization
Tags: Safetensors · t5
Files and versions (branch: main, 3.29 GB)
1 contributor · History: 32 commits
Latest commit 829ede5 (verified) by vikosik3000: "Training in progress, step 30972", 11 months ago
.gitattributes             1.52 kB    initial commit                      11 months ago
added_tokens.json          2.72 kB    Training in progress, step 1000     11 months ago
config.json                807 Bytes  Training in progress, step 1000     11 months ago
merges.txt                 1.27 MB    Training in progress, step 1000     11 months ago
model.safetensors          3.28 GB    Training in progress, step 30972    11 months ago
special_tokens_map.json    688 Bytes  Training in progress, step 1000     11 months ago
tokenizer_config.json      20.3 kB    Training in progress, step 1000     11 months ago
training_args.bin          5.43 kB    Training in progress, step 1000     11 months ago
vocab.json                 1.81 MB    Training in progress, step 1000     11 months ago

training_args.bin is a pickle file. Detected pickle imports (10):
    transformers.trainer_utils.HubStrategy
    transformers.trainer_utils.SaveStrategy
    transformers.trainer_utils.IntervalStrategy
    torch.device
    transformers.trainer_utils.SchedulerType
    transformers.training_args.OptimizerNames
    accelerate.utils.dataclasses.DistributedType
    transformers.training_args_seq2seq.Seq2SeqTrainingArguments
    accelerate.state.PartialState
    transformers.trainer_pt_utils.AcceleratorConfig
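The "Detected Pickle imports" list above exists because unpickling a file executes code: every global the pickle references is imported and called during loading, so a hosting service flags which modules and classes a pickle would pull in *before* anyone loads it. This kind of scan can be done statically, by walking the pickle's opcode stream instead of executing it. The sketch below uses only the standard library's `pickletools`; it is an illustration of the general technique, not Hugging Face's actual scanner, and the `detected_pickle_imports` helper name is made up for this example.

```python
import pickle
import pickletools
from collections import OrderedDict

def detected_pickle_imports(data: bytes) -> set[str]:
    """Statically list the module.name globals a pickle would import,
    without executing it, by walking its opcode stream."""
    imports: set[str] = set()
    strings: list[str] = []  # recent string constants, consumed by STACK_GLOBAL
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # Protocol <= 3: argument is a "module name" pair in one string.
            module, _, name = arg.partition(" ")
            imports.add(f"{module}.{name}")
        elif opcode.name in ("SHORT_BINUNICODE", "BINUNICODE", "UNICODE"):
            strings.append(arg)
        elif opcode.name == "STACK_GLOBAL" and len(strings) >= 2:
            # Protocol >= 4: module and qualname were pushed as strings.
            imports.add(f"{strings[-2]}.{strings[-1]}")
    return imports

# A small self-contained payload referencing a stdlib class:
payload = pickle.dumps(OrderedDict(a=1))
print(detected_pickle_imports(payload))  # → {'collections.OrderedDict'}
```

When a scan like this surfaces only well-known `transformers`/`accelerate` training classes, as in the list above, the file is most likely an ordinary `Seq2SeqTrainingArguments` dump. To load it more defensively with PyTorch, `torch.load(..., weights_only=True)` rejects arbitrary globals, and recent PyTorch versions let you allow-list specific classes via `torch.serialization.add_safe_globals`.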