(File-listing residue from the Hugging Face Hub repository page: tokenizer uploads and `last-checkpoint` files from training checkpoints at steps 50 and 100; latest commit `290a52e`.)
The Hub's pickle scanner flags `training_args.bin` with "Detected Pickle imports (11)":
- "accelerate.utils.dataclasses.DeepSpeedPlugin",
- "transformers.trainer_utils.IntervalStrategy",
- "torch.device",
- "transformers.training_args.TrainingArguments",
- "transformers.trainer_utils.SchedulerType",
- "accelerate.state.PartialState",
- "transformers.training_args.OptimizerNames",
- "transformers.integrations.deepspeed.HfTrainerDeepSpeedConfig",
- "torch.float16",
- "transformers.trainer_utils.HubStrategy",
- "accelerate.utils.dataclasses.DistributedType"
How do I fix this warning?
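If the file is your own, there may be nothing to fix: the listed imports are ordinary `transformers`/`accelerate`/`torch` classes that `Trainer` always pickles into `training_args.bin`. If you need to load such a file from a source you do not fully trust, one standard mitigation is to unpickle through an allowlist, refusing any import you did not expect. A minimal stdlib sketch (the `ALLOWED` set and `AllowlistUnpickler` class are my own names; for a real `training_args.bin` the allowlist would contain the eleven `(module, class)` pairs listed above):

```python
import collections
import io
import pickle

# Hypothetical allowlist: only the classes you expect the file to contain.
ALLOWED = {
    ("collections", "OrderedDict"),
}

class AllowlistUnpickler(pickle.Unpickler):
    """Refuse to import any class not explicitly allowlisted."""
    def find_class(self, module, name):
        if (module, name) not in ALLOWED:
            raise pickle.UnpicklingError(f"blocked import: {module}.{name}")
        return super().find_class(module, name)

# demo payload standing in for the real file's bytes
data = pickle.dumps(collections.OrderedDict([("step", 100)]))
obj = AllowlistUnpickler(io.BytesIO(data)).load()
print(obj)
```

Recent `torch` versions expose the same idea through `torch.load(..., weights_only=True)` plus `torch.serialization.add_safe_globals`, which is the usual route for `.bin` checkpoints saved with `torch.save`.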