This repository contains public models of [Latent Preference Optimization (LPO)](https://github.com/Kwai-Kolors/LPO) based on SD1.5 and SDXL.