# Hugging Face Integration Guide
## Overview
Byte Dream now includes full Hugging Face Hub integration, allowing you to:
- Upload trained models to HF Hub with `push_to_hub()`
- Download and use models from HF Hub with `from_pretrained()`
- Load models directly in the generator using `hf_repo_id` parameter
- Deploy to Hugging Face Spaces easily
## Quick Start
### 1. Get Your Hugging Face Token
1. Go to https://huggingface.co/settings/tokens
2. Click "New token"
3. Give it a name (e.g., "Byte Dream")
4. Select "Write" permissions
5. Copy the token (starts with `hf_...`)
### 2. Upload Your Model to Hugging Face
After training your model:
```bash
# Method 1: Interactive (recommended)
python publish_to_hf.py
# You'll be prompted for:
# - Your HF token
# - Repository ID (e.g., Enzo8930302/ByteDream)
```
Or programmatically:
```python
from bytedream import ByteDreamGenerator
# Load your trained model
generator = ByteDreamGenerator(model_path="./models/bytedream")
# Upload to Hugging Face
generator.push_to_hub(
    repo_id="your_username/ByteDream",
    token="hf_xxxxxxxxxxxxx",  # Your HF token
    private=False,  # Set True for private model
)
```
### 3. Use Model from Hugging Face
#### Python API
```python
from bytedream import ByteDreamGenerator
# Load directly from HF Hub
generator = ByteDreamGenerator(hf_repo_id="your_username/ByteDream")
# Generate image
image = generator.generate(
    prompt="A beautiful sunset over mountains",
    num_inference_steps=50,
    guidance_scale=7.5,
)
image.save("output.png")
```
#### Command Line
```bash
# Generate using model from HF
python infer.py \
  --prompt "A dragon flying over a castle" \
  --hf_repo "your_username/ByteDream" \
  --output dragon.png
```
#### Gradio Web Interface
```bash
# Set environment variable
export HF_REPO_ID=your_username/ByteDream
# Run app (will load from HF)
python app.py
```
## Detailed Usage
### Upload Methods
#### Method 1: publish_to_hf.py (Recommended)
```bash
python publish_to_hf.py [token] [repo_id]
# Examples:
python publish_to_hf.py
python publish_to_hf.py hf_xxxx Enzo8930302/ByteDream
```
#### Method 2: upload_to_hf.py
```bash
python upload_to_hf.py \
  --model_path ./models/bytedream \
  --repo_id your_username/ByteDream \
  --token hf_xxxx \
  --private  # Optional: make repository private
```
#### Method 3: Python API
```python
from bytedream import ByteDreamGenerator
generator = ByteDreamGenerator(model_path="./models/bytedream")
generator.push_to_hub(
    repo_id="your_username/ByteDream",
    token="hf_xxxx",
    private=False,
    commit_message="Upload Byte Dream model v1.0",
)
```
### Download/Load Methods
#### Method 1: Generator with hf_repo_id
```python
from bytedream import ByteDreamGenerator
# Automatically downloads from HF
generator = ByteDreamGenerator(
    hf_repo_id="your_username/ByteDream",
    config_path="config.yaml",
    device="cpu",
)
```
#### Method 2: Pipeline from_pretrained
```python
from bytedream.pipeline import ByteDreamPipeline
import torch
# Load pipeline directly from HF
pipeline = ByteDreamPipeline.from_pretrained(
    "your_username/ByteDream",
    device="cpu",
    dtype=torch.float32,
)
# Generate
result = pipeline(
    prompt="Your prompt here",
    num_inference_steps=50,
    guidance_scale=7.5,
)
result[0].save("output.png")
```
#### Method 3: Local Directory
```python
from bytedream.pipeline import ByteDreamPipeline
# First download manually or save locally
pipeline = ByteDreamPipeline.from_pretrained(
    "./models/bytedream",  # Local path
    device="cpu",
)
```
## Deploy to Hugging Face Spaces
### Option 1: Manual Deployment
1. **Create Space**
- Go to https://huggingface.co/spaces
- Click "Create new Space"
- Choose Gradio SDK
- Select CPU hardware (Basic tier is free)
2. **Upload Files**
```bash
cd your_space_directory
git lfs install
git clone https://huggingface.co/spaces/your_username/your_space
cp -r ../Byte Dream/* your_space/
git add .
git commit -m "Initial commit"
git push
```
3. **Set Environment Variable**
- In your Space settings
- Add `HF_REPO_ID` variable with value `your_username/ByteDream`
4. **Deploy**
- The app will automatically deploy
- Available at: `https://huggingface.co/spaces/your_username/your_space`
### Option 2: Using Spaces SDK
```python
# In your Byte Dream directory
from huggingface_hub import HfApi
api = HfApi()
# Create and push space
api.create_repo(
    repo_id="your_username/ByteDream-Space",
    repo_type="space",
    space_sdk="gradio",
    token="hf_xxxx",
)
api.upload_folder(
    folder_path=".",
    repo_id="your_username/ByteDream-Space",
    repo_type="space",
    token="hf_xxxx",
)
```
## Configuration
### Environment Variables
```bash
# Load model from HF in app.py
export HF_REPO_ID=your_username/ByteDream
# Custom model path
export MODEL_PATH=./models/bytedream
```
### Model Files Structure
When uploaded to HF, your model will have this structure:
```
your_username/ByteDream/
├── unet/
│   └── pytorch_model.bin      # UNet weights
├── vae/
│   └── pytorch_model.bin      # VAE weights
├── scheduler/
│   └── scheduler_config.json  # Scheduler config
├── model_index.json           # Pipeline config
├── config.yaml                # Full configuration
└── README.md                  # Model card
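You can sanity-check a local copy against this layout. `missing_files` below is a hypothetical helper for illustration, not part of the library.

```python
from pathlib import Path

# Files expected from the tree above (README.md omitted since a missing
# model card does not prevent loading).
EXPECTED = [
    "unet/pytorch_model.bin",
    "vae/pytorch_model.bin",
    "scheduler/scheduler_config.json",
    "model_index.json",
    "config.yaml",
]

def missing_files(model_dir):
    """Return the expected files that are absent from model_dir."""
    root = Path(model_dir)
    return [rel for rel in EXPECTED if not (root / rel).exists()]
```

An empty return value means the directory is complete; otherwise it lists exactly what to re-download.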
## Examples
### Example 1: Complete Workflow
```python
from bytedream import ByteDreamGenerator
# 1. Train model
# python train.py
# 2. Load trained model
generator = ByteDreamGenerator(model_path="./models/bytedream")
# 3. Test generation
image = generator.generate("Test prompt")
image.save("test.png")
# 4. Upload to HF
generator.push_to_hub(
    repo_id="Enzo8930302/ByteDream",
    token="hf_xxxx",
)
print("✓ Model uploaded!")
```
### Example 2: Use Community Models
```python
from bytedream import ByteDreamGenerator
# Load community model
generator = ByteDreamGenerator(
    hf_repo_id="community-member/fantasy-model"
)
# Generate fantasy art
image = generator.generate(
    prompt="Majestic dragon, fantasy landscape, dramatic lighting",
    num_inference_steps=75,
    guidance_scale=9.0,
)
image.save("dragon.png")
```
### Example 3: Batch Processing
```python
from bytedream import ByteDreamGenerator
generator = ByteDreamGenerator(hf_repo_id="your_username/ByteDream")
prompts = [
    "Sunset over mountains",
    "Cyberpunk city at night",
    "Fantasy castle in clouds",
    "Underwater coral reef",
]
images = generator.generate_batch(
    prompts=prompts,
    width=512,
    height=512,
    num_inference_steps=50,
)
for i, img in enumerate(images):
    img.save(f"image_{i}.png")
```
## Troubleshooting
### Error: "Repository not found"
**Solution**: Make sure the repository exists and is public, or you have proper authentication.
```python
# For private repos, provide token
generator = ByteDreamGenerator(
    hf_repo_id="your_username/private-model",
    config_path="config.yaml",
)
# Token should be configured in ~/.cache/huggingface/token
```
### Error: "Model not trained"
**Solution**: Train the model first or download pretrained weights.
```bash
# Train model
python train.py
# Or download from HF
python infer.py --hf_repo username/model --prompt "test"
```
### Error: "Out of memory"
**Solution**: Reduce image size or enable memory efficient mode.
```python
generator = ByteDreamGenerator(hf_repo_id="username/model")
generator.pipeline.enable_memory_efficient_mode()
image = generator.generate(
    prompt="...",
    width=256,  # Smaller size
    height=256,
)
```
## Best Practices
1. **Token Security**: Never commit your HF token to git
- Use environment variables
- Store in `~/.cache/huggingface/token`
2. **Model Versioning**: Use meaningful commit messages
```python
generator.push_to_hub(
    repo_id="username/ByteDream",
    commit_message="Add v2.0 with improved quality",
)
```
3. **Private Models**: For proprietary models
```python
generator.push_to_hub(
    repo_id="username/private-model",
    private=True,
)
```
4. **Model Cards**: Include a good README
- Describe training data
- Show example prompts
- List known limitations
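Practice 1 above (token security) can be sketched as follows. `resolve_token` is a hypothetical helper; `HF_TOKEN` is a conventional variable name that recent versions of huggingface_hub also read automatically, but adjust it to your setup.

```python
import os

def resolve_token():
    """Return the Hugging Face token from the environment, or None if unset."""
    return os.environ.get("HF_TOKEN")

token = resolve_token()
if token is None:
    print("HF_TOKEN is not set; run `export HF_TOKEN=hf_...` first")
```

With this in place, publishing never touches a hardcoded secret: `generator.push_to_hub(repo_id="username/ByteDream", token=resolve_token())`.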
## API Reference
### ByteDreamGenerator
```python
class ByteDreamGenerator:
    def __init__(
        self,
        model_path: Optional[str] = None,
        config_path: str = "config.yaml",
        device: str = "cpu",
        hf_repo_id: Optional[str] = None,  # NEW!
    )

    def push_to_hub(
        self,
        repo_id: str,
        token: Optional[str] = None,
        private: bool = False,
        commit_message: str = "Upload Byte Dream model",
    )

    def save_pretrained(self, save_directory: str)
### ByteDreamPipeline
```python
class ByteDreamPipeline:
    @classmethod
    def from_pretrained(
        cls,
        model_path: Union[str, Path],  # Can be an HF repo ID!
        device: str = "cpu",
        dtype: torch.dtype = torch.float32,
    ) -> "ByteDreamPipeline"

    def save_pretrained(self, save_directory: Union[str, Path])
```
## Resources
- Hugging Face Hub: https://huggingface.co
- Documentation: https://huggingface.co/docs/hub
- Spaces: https://huggingface.co/spaces
- Token settings: https://huggingface.co/settings/tokens
## Support
For issues or questions:
1. Check this guide first
2. Review error messages carefully
3. Check Hugging Face documentation
4. Open a GitHub issue
---
**Happy Generating! 🎨**