{
"model_name": "VQ-16",
"image_size": 512,
"downsample_size": 16,
"n_q": 8,
"codebook_size": 16384,
"codebook_embed_dim": 8,
"latent_channels": 8
}