Hugging Face Integration Guide
Overview
Byte Dream now includes full Hugging Face Hub integration, allowing you to:
- Upload trained models to HF Hub with push_to_hub()
- Download and use models from HF Hub with from_pretrained()
- Load models directly in the generator using the hf_repo_id parameter
- Deploy to Hugging Face Spaces easily
Quick Start
1. Get Your Hugging Face Token
- Go to https://huggingface.co/settings/tokens
- Click "New token"
- Give it a name (e.g., "Byte Dream")
- Select "Write" permissions
- Copy the token (starts with hf_...)
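As a quick guard against pasting the wrong credential, a script can sanity-check the token format before using it. This is a minimal sketch based on the "starts with hf_" note above; looks_like_hf_token is a hypothetical helper, not part of Byte Dream or huggingface_hub:

```python
def looks_like_hf_token(token: str) -> bool:
    """Cheap sanity check: Hugging Face user access tokens start with 'hf_'."""
    token = token.strip()
    return token.startswith("hf_") and len(token) > len("hf_")

# Accept a real-looking token, reject an obviously wrong one.
print(looks_like_hf_token("hf_xxxxxxxxxxxxx"))  # True
print(looks_like_hf_token("my-github-token"))   # False
```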
2. Upload Your Model to Hugging Face
After training your model:
# Method 1: Interactive (recommended)
python publish_to_hf.py
# You'll be prompted for:
# - Your HF token
# - Repository ID (e.g., Enzo8930302/ByteDream)
Or programmatically:
from bytedream import ByteDreamGenerator
# Load your trained model
generator = ByteDreamGenerator(model_path="./models/bytedream")
# Upload to Hugging Face
generator.push_to_hub(
repo_id="your_username/ByteDream",
token="hf_xxxxxxxxxxxxx", # Your HF token
private=False, # Set True for private model
)
3. Use Model from Hugging Face
Python API
from bytedream import ByteDreamGenerator
# Load directly from HF Hub
generator = ByteDreamGenerator(hf_repo_id="your_username/ByteDream")
# Generate image
image = generator.generate(
prompt="A beautiful sunset over mountains",
num_inference_steps=50,
guidance_scale=7.5
)
image.save("output.png")
Command Line
# Generate using model from HF
python infer.py \
--prompt "A dragon flying over castle" \
--hf_repo "your_username/ByteDream" \
--output dragon.png
Gradio Web Interface
# Set environment variable
export HF_REPO_ID=your_username/ByteDream
# Run app (will load from HF)
python app.py
Detailed Usage
Upload Methods
Method 1: publish_to_hf.py (Recommended)
python publish_to_hf.py [token] [repo_id]
# Examples:
python publish_to_hf.py
python publish_to_hf.py hf_xxxx Enzo8930302/ByteDream
Method 2: upload_to_hf.py
python upload_to_hf.py \
--model_path ./models/bytedream \
--repo_id your_username/ByteDream \
--token hf_xxxx \
--private # Optional: make repository private
Method 3: Python API
from bytedream import ByteDreamGenerator
generator = ByteDreamGenerator(model_path="./models/bytedream")
generator.push_to_hub(
repo_id="your_username/ByteDream",
token="hf_xxxx",
private=False,
commit_message="Upload Byte Dream model v1.0"
)
Download/Load Methods
Method 1: Generator with hf_repo_id
from bytedream import ByteDreamGenerator
# Automatically downloads from HF
generator = ByteDreamGenerator(
hf_repo_id="your_username/ByteDream",
config_path="config.yaml",
device="cpu"
)
Method 2: Pipeline from_pretrained
from bytedream.pipeline import ByteDreamPipeline
import torch
# Load pipeline directly from HF
pipeline = ByteDreamPipeline.from_pretrained(
"your_username/ByteDream",
device="cpu",
dtype=torch.float32
)
# Generate
result = pipeline(
prompt="Your prompt here",
num_inference_steps=50,
guidance_scale=7.5
)
result[0].save("output.png")
Method 3: Local Directory
from bytedream.pipeline import ByteDreamPipeline
# First download manually or save locally
pipeline = ByteDreamPipeline.from_pretrained(
"./models/bytedream", # Local path
device="cpu"
)
Deploy to Hugging Face Spaces
Option 1: Manual Deployment
Create Space
- Go to https://huggingface.co/spaces
- Click "Create new Space"
- Choose Gradio SDK
- Select CPU hardware (Basic tier is free)
Upload Files
cd your_space_directory
git lfs install
git clone https://huggingface.co/spaces/your_username/your_space
cp -r "../Byte Dream/"* your_space/
cd your_space
git add .
git commit -m "Initial commit"
git push
Set Environment Variable
- In your Space settings
- Add an HF_REPO_ID variable with value your_username/ByteDream
Deploy
- The app will automatically deploy
- Available at https://huggingface.co/spaces/your_username/your_space
Option 2: Using Spaces SDK
# In your Byte Dream directory
from huggingface_hub import HfApi
api = HfApi()
# Create and push space
api.create_repo(
repo_id="your_username/ByteDream-Space",
repo_type="space",
space_sdk="gradio",
token="hf_xxxx"
)
api.upload_folder(
folder_path=".",
repo_id="your_username/ByteDream-Space",
repo_type="space",
token="hf_xxxx"
)
Configuration
Environment Variables
# Load model from HF in app.py
export HF_REPO_ID=your_username/ByteDream
# Custom model path
export MODEL_PATH=./models/bytedream
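A sketch of how an app might consume these two variables, preferring the Hub repo when HF_REPO_ID is set and falling back to the local path otherwise. pick_model_source is a hypothetical helper; the default path matches the examples in this guide:

```python
import os

def pick_model_source(env=None):
    """Prefer HF_REPO_ID when set; otherwise fall back to MODEL_PATH."""
    env = os.environ if env is None else env
    return env.get("HF_REPO_ID") or env.get("MODEL_PATH", "./models/bytedream")

# With HF_REPO_ID exported, the Hub repo wins; with nothing set,
# the local default from this guide is used.
print(pick_model_source({"HF_REPO_ID": "your_username/ByteDream"}))
print(pick_model_source({}))
```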
Model Files Structure
When uploaded to HF, your model will have this structure:
your_username/ByteDream/
├── unet/
│   └── pytorch_model.bin     # UNet weights
├── vae/
│   └── pytorch_model.bin     # VAE weights
├── scheduler/
│   └── scheduler_config.json # Scheduler config
├── model_index.json          # Pipeline config
├── config.yaml               # Full configuration
└── README.md                 # Model card
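After an upload, you can sanity-check the layout against this structure. The sketch below assumes you already have the repo's file listing (for example via huggingface_hub's list_repo_files); missing_model_files is a hypothetical helper:

```python
# Files the layout above expects, relative to the repo root.
EXPECTED_FILES = {
    "unet/pytorch_model.bin",
    "vae/pytorch_model.bin",
    "scheduler/scheduler_config.json",
    "model_index.json",
    "config.yaml",
    "README.md",
}

def missing_model_files(repo_files):
    """Return the expected files that are absent from a repo file listing."""
    return EXPECTED_FILES - set(repo_files)

# A complete listing yields an empty set; a partial one names what is missing.
print(missing_model_files(EXPECTED_FILES))  # set()
```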
Examples
Example 1: Complete Workflow
from bytedream import ByteDreamGenerator
# 1. Train model
# python train.py
# 2. Load trained model
generator = ByteDreamGenerator(model_path="./models/bytedream")
# 3. Test generation
image = generator.generate("Test prompt")
image.save("test.png")
# 4. Upload to HF
generator.push_to_hub(
repo_id="Enzo8930302/ByteDream",
token="hf_xxxx"
)
print("✅ Model uploaded!")
Example 2: Use Community Models
from bytedream import ByteDreamGenerator
# Load community model
generator = ByteDreamGenerator(
hf_repo_id="community-member/fantasy-model"
)
# Generate fantasy art
image = generator.generate(
prompt="Majestic dragon, fantasy landscape, dramatic lighting",
num_inference_steps=75,
guidance_scale=9.0
)
image.save("dragon.png")
Example 3: Batch Processing
from bytedream import ByteDreamGenerator
generator = ByteDreamGenerator(hf_repo_id="your_username/ByteDream")
prompts = [
"Sunset over mountains",
"Cyberpunk city at night",
"Fantasy castle in clouds",
"Underwater coral reef",
]
images = generator.generate_batch(
prompts=prompts,
width=512,
height=512,
num_inference_steps=50,
)
for i, img in enumerate(images):
img.save(f"image_{i}.png")
Troubleshooting
Error: "Repository not found"
Solution: Make sure the repository exists and is public, or you have proper authentication.
# For private repos, provide token
generator = ByteDreamGenerator(
hf_repo_id="your_username/private-model",
config_path="config.yaml"
)
# Token should be configured in ~/.cache/huggingface/token
Error: "Model not trained"
Solution: Train the model first or download pretrained weights.
# Train model
python train.py
# Or download from HF
python infer.py --hf_repo username/model --prompt "test"
Error: "Out of memory"
Solution: Reduce image size or enable memory-efficient mode.
generator = ByteDreamGenerator(hf_repo_id="username/model")
generator.pipeline.enable_memory_efficient_mode()
image = generator.generate(
prompt="...",
width=256, # Smaller size
height=256,
)
Best Practices
Token Security: Never commit your HF token to git
- Use environment variables
- Store it in ~/.cache/huggingface/token
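One way to follow both recommendations in code, assuming the standard HF_TOKEN variable and the default token file location. resolve_hf_token is a hypothetical helper, not Byte Dream API:

```python
import os
from pathlib import Path

def resolve_hf_token(env=None):
    """Return a token from the environment, else from the cached token file.

    HF_TOKEN is the variable the huggingface_hub library also honors;
    the file is the one written by `huggingface-cli login`.
    """
    env = os.environ if env is None else env
    token = env.get("HF_TOKEN")
    if token:
        return token
    token_file = Path.home() / ".cache" / "huggingface" / "token"
    if token_file.is_file():
        return token_file.read_text().strip() or None
    return None
```

This keeps the token out of source files entirely; a call like push_to_hub(token=resolve_hf_token()) would then pick it up at call time.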
Model Versioning: Use meaningful commit messages
generator.push_to_hub(
    repo_id="username/ByteDream",
    commit_message="Add v2.0 with improved quality",
)
Private Models: For proprietary models
generator.push_to_hub(
    repo_id="username/private-model",
    private=True,
)
Model Cards: Include a good README
- Describe training data
- Show example prompts
- List known limitations
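The three bullet points above can be turned into a minimal README template programmatically. This is a sketch; make_model_card is a hypothetical helper, and an upload script could write its output to README.md before pushing:

```python
def make_model_card(model_name, training_data, example_prompt, limitations):
    """Assemble a minimal model card covering data, examples, and limits."""
    return "\n".join([
        f"# {model_name}",
        "",
        "## Training data",
        training_data,
        "",
        "## Example prompts",
        f"- `{example_prompt}`",
        "",
        "## Known limitations",
        limitations,
    ])

card = make_model_card(
    "ByteDream",
    "Describe your dataset here.",
    "A beautiful sunset over mountains",
    "List known failure modes here.",
)
print(card.splitlines()[0])  # "# ByteDream"
```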
API Reference
ByteDreamGenerator
class ByteDreamGenerator:
def __init__(
self,
model_path: Optional[str] = None,
config_path: str = "config.yaml",
device: str = "cpu",
hf_repo_id: Optional[str] = None, # NEW!
)
def push_to_hub(
self,
repo_id: str,
token: Optional[str] = None,
private: bool = False,
commit_message: str = "Upload Byte Dream model",
)
def save_pretrained(self, save_directory: str)
ByteDreamPipeline
class ByteDreamPipeline:
@classmethod
def from_pretrained(
cls,
model_path: Union[str, Path], # Can be HF repo ID!
device: str = "cpu",
dtype: torch.dtype = torch.float32,
) -> "ByteDreamPipeline"
def save_pretrained(self, save_directory: Union[str, Path])
Resources
- Hugging Face Hub: https://huggingface.co
- Documentation: https://huggingface.co/docs/hub
- Spaces: https://huggingface.co/spaces
- Token settings: https://huggingface.co/settings/tokens
Support
For issues or questions:
- Check this guide first
- Review error messages carefully
- Check Hugging Face documentation
- Open GitHub issue
Happy Generating! 🎨