yichaodu/DiffusionDPO-alignment-gpt-4o
Tags: Text-to-Image · Diffusers · Safetensors · stable-diffusion · stable-diffusion-diffusers · DPO · DiffusionDPO · arxiv:2407.04842
DiffusionDPO-alignment-gpt-4o / README.md (branch: main)

Commit History
Upload README.md with huggingface_hub · fa19e4a (verified) · yichaodu committed on Jul 9, 2024
Upload README.md with huggingface_hub · 6cfda75 (verified) · yichaodu committed on Jun 20, 2024
Upload README.md with huggingface_hub · 2722cd1 (verified) · yichaodu committed on Jun 20, 2024
Upload README.md with huggingface_hub · fa4c451 (verified) · yichaodu committed on Jun 19, 2024