Dataset: mrcuddle/SD-Prompt-DPO

Tasks: Text Generation
Modalities: Text
Formats: json
Size: 1K - 10K
Tags: dpo
Libraries: Datasets, pandas, Croissant
SD-Prompt-DPO / README.md

Commit History

Create README.md
f9f78c7 (verified) by mrcuddle, committed on Feb 19, 2025