7B-DPO-alpha / config.json

Commit History

Upload folder using huggingface_hub
4122cdd

JosephusCheung committed on