---
title: Draw Your Floorplan - ControlNet
emoji: 🏠
colorFrom: blue
colorTo: green
sdk: gradio
sdk_version: 4.44.0
app_file: app.py
pinned: false
license: mit
models:
- Qistinasofea/controlnet-floorplan
- stable-diffusion-v1-5/stable-diffusion-v1-5
---
# 🏠 Draw Your Floorplan - ControlNet
**AI54 Final Project - Spatially Conditioned Floorplan Generation**
## 🎨 Interactive Demo
This Space allows you to **draw colored segmentation masks** and generate architectural floorplans using a fine-tuned ControlNet model.
### How to Use:
1. **Draw** colored regions on the canvas - each color represents a different room type
2. **Describe** your floorplan in the text box
3. **Adjust** settings if needed (inference steps, control strength, seed)
4. Click **Generate Floorplan** to see your AI-generated layout!
### Suggested Colors:
- 🔴 Red - Living room / Main spaces
- 🟢 Green - Bedrooms
- 🔵 Blue - Bathrooms
- 🟡 Yellow - Kitchen
- 🟣 Purple - Dining area
- 🟠 Orange - Office / Study
- 🩵 Cyan - Utility / Storage
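Instead of drawing on the canvas, a mask can also be built programmatically. The sketch below is a minimal illustration: the exact RGB values and the 512×512 canvas size are assumptions (any set of distinct colors works, as the model conditions on the spatial regions):

```python
import numpy as np

# Suggested color coding (RGB values are assumptions; any distinct colors work)
ROOM_COLORS = {
    "living_room": (255, 0, 0),    # red
    "bedroom":     (0, 255, 0),    # green
    "bathroom":    (0, 0, 255),    # blue
    "kitchen":     (255, 255, 0),  # yellow
    "dining":      (128, 0, 128),  # purple
    "office":      (255, 165, 0),  # orange
    "utility":     (0, 255, 255),  # cyan
}

def blank_mask(size: int = 512) -> np.ndarray:
    """White canvas; the 512 px size mirrors a typical SD 1.5 input (assumed)."""
    return np.full((size, size, 3), 255, dtype=np.uint8)

def paint_room(mask: np.ndarray, room: str,
               top: int, left: int, bottom: int, right: int) -> np.ndarray:
    """Fill a rectangular region with the room type's color."""
    mask[top:bottom, left:right] = ROOM_COLORS[room]
    return mask

mask = blank_mask()
paint_room(mask, "living_room", 50, 50, 300, 250)
paint_room(mask, "bedroom", 50, 260, 300, 460)
# Image.fromarray(mask).save("segmentation_mask.png")  # via Pillow, to feed the model
```

The resulting array can be saved as a PNG and uploaded to the Space, or passed directly to a local pipeline.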
---
## 📊 Model Information
### Training Details:
- **Method:** Full ControlNet Fine-Tuning
- **Base Model:** Stable Diffusion 1.5 (frozen)
- **ControlNet:** Segmentation variant (fully trained)
- **Dataset:** 11,375 orientation-normalized floorplan samples
- **Parameters:** 361M trainable parameters (100% of ControlNet)
- **Training Steps:** 10,000
- **Final Loss:** 0.0887
- **Training Time:** 3.7 hours on T4 GPU
### Architecture:
The model uses a two-component architecture:
1. **Base Model (SD 1.5):** Generates realistic textures and appearance (frozen weights)
2. **ControlNet:** Guides spatial structure based on colored segmentation input (fully fine-tuned)
This separation allows the model to:
- ✅ Preserve spatial layouts from user drawings
- ✅ Generate realistic architectural details
- ✅ Maintain consistent room boundaries
- ✅ Produce diverse outputs from the same layout
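For use outside the Space UI, the frozen base + ControlNet pairing can be assembled with 🤗 Diffusers. This is a minimal sketch assuming the standard `StableDiffusionControlNetPipeline` API: the default values below are illustrative, not the demo's actual settings, and `mask_path` is a hypothetical input file:

```python
GENERATION_DEFAULTS = {
    "num_inference_steps": 30,             # illustrative default, not the Space's
    "controlnet_conditioning_scale": 1.0,  # the UI's "control strength"
    "seed": 42,
}

def generate_floorplan(mask_path: str, prompt: str, **overrides):
    """Run the ControlNet pipeline on a drawn segmentation mask.

    Heavy imports live inside the function so the sketch can be read
    (and the defaults inspected) without torch/diffusers installed.
    """
    import torch
    from diffusers import ControlNetModel, StableDiffusionControlNetPipeline
    from PIL import Image

    params = {**GENERATION_DEFAULTS, **overrides}

    # ControlNet carries the fine-tuned spatial guidance...
    controlnet = ControlNetModel.from_pretrained(
        "Qistinasofea/controlnet-floorplan", torch_dtype=torch.float16
    )
    # ...while the SD 1.5 base stays frozen, exactly as during training.
    pipe = StableDiffusionControlNetPipeline.from_pretrained(
        "stable-diffusion-v1-5/stable-diffusion-v1-5",
        controlnet=controlnet,
        torch_dtype=torch.float16,
    ).to("cuda")

    mask = Image.open(mask_path).convert("RGB")
    generator = torch.Generator("cuda").manual_seed(params["seed"])
    return pipe(
        prompt,
        image=mask,
        num_inference_steps=params["num_inference_steps"],
        controlnet_conditioning_scale=params["controlnet_conditioning_scale"],
        generator=generator,
    ).images[0]
```

Raising `controlnet_conditioning_scale` makes the output follow the drawn layout more strictly; lowering it gives the base model more freedom.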
---
## 🔗 Links
- **Trained Model:** [Qistinasofea/controlnet-floorplan](https://huggingface.co/Qistinasofea/controlnet-floorplan)
- **Dataset:** [Qistinasofea/floorplan-12k-aligned](https://huggingface.co/datasets/Qistinasofea/floorplan-12k-aligned)
- **Training Notebook:** Available in model repository
---
## 🎓 Academic Context
This is the final project for the **AI54: Artificial Intelligence** course, focused on:
- Conditional image generation
- Spatial control in diffusion models
- ControlNet architecture and training
- Parameter-efficient fine-tuning considerations
- Real-world application development
### Key Contributions:
1. **Dataset Preprocessing:** Orientation normalization using PCA-based rotation alignment
2. **Training Strategy:** Full fine-tuning justified by dataset size (11,375 samples)
3. **User Interface:** Visual layout-driven interaction for non-technical users
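The PCA-based rotation alignment from contribution 1 can be sketched as follows. This is a minimal illustration, not the project's actual preprocessing code: find the dominant axis of a floorplan's foreground pixels, then rotate each sample by minus that angle so all layouts share a canonical orientation:

```python
import numpy as np

def principal_angle(mask: np.ndarray) -> float:
    """Angle in degrees (mod 180) of the dominant axis of foreground pixels,
    found as the leading eigenvector of the pixel coordinates' covariance."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    pts -= pts.mean(axis=0)                 # center the point cloud
    cov = pts.T @ pts / len(pts)            # 2x2 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)
    major = eigvecs[:, np.argmax(eigvals)]  # principal axis
    return float(np.degrees(np.arctan2(major[1], major[0])) % 180.0)

# Rotating each sample by -principal_angle(mask) aligns dominant walls with
# the image axes (e.g. via PIL's Image.rotate or scipy.ndimage.rotate).
```

Normalizing orientations this way means the model never has to spend capacity learning rotated copies of the same layout.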
---
## 💻 Technical Stack
- **Framework:** 🤗 Diffusers
- **Model:** ControlNet + Stable Diffusion 1.5
- **Interface:** Gradio
- **Deployment:** HuggingFace Spaces
- **Hardware:** GPU-enabled (T4 or better recommended)
---
## 📝 Citation
If you use this model or approach in your work, please cite:
```bibtex
@misc{controlnet-floorplan-2024,
  author       = {Qistinasofea},
  title        = {ControlNet for Floorplan Generation},
  year         = {2024},
  publisher    = {HuggingFace},
  howpublished = {\url{https://huggingface.co/Qistinasofea/controlnet-floorplan}}
}
```
---
## 📜 License
This project is released under the MIT License. The base Stable Diffusion 1.5 model follows its original CreativeML Open RAIL-M license.
---
**Built with ❤️ for the AI54 Final Project**