DPO adapter
Commit 6028948 (verified): "dpo End of training"
The Hub flags `training_args.bin` with "Detected Pickle imports (11)":
- "accelerate.state.PartialState",
- "transformers.trainer_utils.IntervalStrategy",
- "transformers.trainer_utils.SchedulerType",
- "trl.trainer.dpo_config.FDivergenceType",
- "transformers.trainer_utils.HubStrategy",
- "transformers.trainer_utils.SaveStrategy",
- "trl.trainer.dpo_config.DPOConfig",
- "transformers.training_args.OptimizerNames",
- "torch.device",
- "transformers.trainer_pt_utils.AcceleratorConfig",
- "accelerate.utils.dataclasses.DistributedType"
How do I fix this warning?
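The notice itself is informational rather than an error: it does not mean the file is malicious, only that loading it runs the pickle machinery. Two common remedies are (a) deleting `training_args.bin` from the repo, since the adapter weights alone are needed for inference, or (b) publishing the arguments in a non-executable format instead. `transformers.TrainingArguments` exposes `to_dict()`, so the configuration can be exported as JSON. A stdlib-only sketch with a hypothetical stand-in dataclass (field names are illustrative, not the real `TrainingArguments` schema):

```python
import dataclasses
import json

@dataclasses.dataclass
class TrainingArgs:
    """Hypothetical stand-in for transformers.TrainingArguments."""
    learning_rate: float = 5e-5
    num_train_epochs: int = 3
    beta: float = 0.1  # DPO temperature, illustrative only

args = TrainingArgs()

# JSON carries the same configuration but imports nothing on load,
# so a pickle scanner has no opcodes to flag.
blob = json.dumps(dataclasses.asdict(args))
restored = TrainingArgs(**json.loads(blob))
assert restored == args
```

With the JSON file in the repo, the pickled copy can be removed and the warning disappears.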