"""Training-related constants for GRPO and DPO pipelines."""