Qwen-Python-DPO-Adapter / adapter_config.json

Commit History

Upload model trained with Unsloth
72d6ca2
verified

AhmadHatam committed on