Hugging Face
EmilyNguyen235/ddpo-alignment-ppo
Text-to-Image · Diffusers · StableDiffusionPipeline
ddpo-alignment-ppo (commit 9bd6eb3, 5.48 GB)
1 contributor · History: 1 commit
trungdangtapcode    Initial commit    9bd6eb3    5 months ago
feature_extractor/                    Initial commit    5 months ago
safety_checker/                       Initial commit    5 months ago
scheduler/                            Initial commit    5 months ago
text_encoder/                         Initial commit    5 months ago
tokenizer/                            Initial commit    5 months ago
unet/                                 Initial commit    5 months ago
vae/                                  Initial commit    5 months ago
.gitattributes        1.48 kB         Initial commit    5 months ago
README.md             1.57 kB         Initial commit    5 months ago
model_index.json      579 Bytes       Initial commit    5 months ago
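The directory layout above is the standard Diffusers pipeline layout: `model_index.json` maps each component subfolder (`unet`, `vae`, `text_encoder`, and so on) to the library and class that load it. The sketch below parses such a mapping with only the standard library; the JSON content shown is a hypothetical, conventional example, since the actual 579-byte `model_index.json` in this repository is not reproduced here, and the specific class names (e.g. `PNDMScheduler`) are assumptions.

```python
import json

# Hypothetical model_index.json content. The real file in this repo is not
# shown; Diffusers pipelines conventionally map each component subfolder to a
# (library, class) pair like this.
MODEL_INDEX = json.loads("""
{
  "_class_name": "StableDiffusionPipeline",
  "feature_extractor": ["transformers", "CLIPImageProcessor"],
  "safety_checker": ["stable_diffusion", "StableDiffusionSafetyChecker"],
  "scheduler": ["diffusers", "PNDMScheduler"],
  "text_encoder": ["transformers", "CLIPTextModel"],
  "tokenizer": ["transformers", "CLIPTokenizer"],
  "unet": ["diffusers", "UNet2DConditionModel"],
  "vae": ["diffusers", "AutoencoderKL"]
}
""")

def list_components(index: dict) -> dict:
    """Return {subfolder: (library, class_name)} for each pipeline component.

    Keys starting with an underscore (e.g. _class_name) are pipeline-level
    metadata, not component subfolders, so they are skipped.
    """
    return {
        name: tuple(spec)
        for name, spec in index.items()
        if not name.startswith("_")
    }

components = list_components(MODEL_INDEX)
for folder, (library, cls) in sorted(components.items()):
    print(f"{folder}/ -> {library}.{cls}")
```

In practice you would not parse this file by hand: `StableDiffusionPipeline.from_pretrained("EmilyNguyen235/ddpo-alignment-ppo")` from the `diffusers` library reads `model_index.json` itself and instantiates each component from its subfolder.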