tiny-flux-deep / config.json
{
  "hidden_size": 512,
  "num_attention_heads": 4,
  "attention_head_dim": 128,
  "in_channels": 16,
  "joint_attention_dim": 768,
  "pooled_projection_dim": 768,
  "num_double_layers": 15,
  "num_single_layers": 25,
  "mlp_ratio": 4.0,
  "axes_dims_rope": [16, 56, 56],
  "guidance_embeds": true
}
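The fields in this config are internally consistent in the way Flux-style transformers usually are: the model width equals the number of attention heads times the per-head dimension, and the rotary-embedding axis dims partition the per-head dimension. The snippet below is a minimal sanity check of those two invariants; it embeds the JSON verbatim so it is self-contained (in practice you would read `config.json` from the repo), and the stated conventions are an assumption about how this repo's model code consumes the file.

```python
import json

# config.json contents embedded verbatim for a self-contained example;
# normally you would load this with open("config.json") from the repo.
CONFIG_JSON = """
{
  "hidden_size": 512,
  "num_attention_heads": 4,
  "attention_head_dim": 128,
  "in_channels": 16,
  "joint_attention_dim": 768,
  "pooled_projection_dim": 768,
  "num_double_layers": 15,
  "num_single_layers": 25,
  "mlp_ratio": 4.0,
  "axes_dims_rope": [16, 56, 56],
  "guidance_embeds": true
}
"""

config = json.loads(CONFIG_JSON)

# Assumed conventions for Flux-style transformer configs:
# 1) model width = heads * head dim        -> 4 * 128 = 512
# 2) rotary axis dims sum to the head dim  -> 16 + 56 + 56 = 128
assert config["hidden_size"] == (
    config["num_attention_heads"] * config["attention_head_dim"]
)
assert sum(config["axes_dims_rope"]) == config["attention_head_dim"]

print("hidden_size:", config["hidden_size"])
print("rope axes sum:", sum(config["axes_dims_rope"]))
```

If either assertion fails after editing the config, the head count, head dimension, or RoPE axis dims are out of sync and most Flux-style attention implementations would error at load time.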