# CelesteImperia: SDXL OpenVINO (Intel Optimized)

This repository contains the Vedic Cinematic engine optimized for the Intel AI PC ecosystem. It is designed to run on everything from integrated Iris Xe graphics to the latest Intel AI Boost NPUs.

πŸš€ Performance Highlights

  • Verified Benchmark: 6 min 46 s (6:46) for a 1024x1024 generation on standard consumer hardware.
  • Precision: Dual-tier (FP16 for fidelity, INT8 for NPU-native speed).
  • Architecture: Compatible with OpenVINO 2024.x and later.
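
The practical difference between the two precision tiers is weight storage: INT8 halves the UNet's memory footprint relative to FP16. A minimal back-of-the-envelope sketch, assuming roughly 2.6B parameters for the SDXL base UNet (an illustrative figure, not read from this repository):

```python
# Rough UNet weight-memory estimate for the two precision tiers.
# The ~2.6e9 parameter count is an assumption for illustration only;
# the exact count varies by SDXL variant.
UNET_PARAMS = 2.6e9

def weights_gib(params: float, bytes_per_param: float) -> float:
    """Approximate weight storage in GiB at a given precision."""
    return params * bytes_per_param / 2**30

fp16 = weights_gib(UNET_PARAMS, 2)  # FP16: 2 bytes per parameter
int8 = weights_gib(UNET_PARAMS, 1)  # INT8: 1 byte per parameter
print(f"FP16 ~{fp16:.1f} GiB, INT8 ~{int8:.1f} GiB")
```

This is why the INT8 tier fits comfortably in NPU/iGPU memory budgets where the FP16 master may not.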

πŸ“¦ Components

  • unet/: INT8-quantized (NNCF-calibrated) and FP16 master weights.
  • tinyvae/: Tiny VAE in OpenVINO IR format for fast image previews.
  • text_encoder/: Optimized for low-latency prompt encoding.
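
The tiny VAE's preview role comes from the shape of the problem: SDXL diffuses in a 4-channel latent space at 1/8 of the output resolution, so the decoder only has to upscale a small tensor to pixels. A short sketch of that geometry, using the standard SDXL VAE constants (stated here as assumptions, not read from this repository's files):

```python
# Why tiny-VAE previews are cheap: SDXL latents are small.
# Channel count and spatial scale match standard SDXL VAEs
# (assumed values for illustration).
LATENT_CHANNELS = 4
VAE_SCALE = 8  # spatial downscale factor of the SDXL VAE

def latent_shape(height: int, width: int) -> tuple[int, int, int]:
    """Latent tensor shape (C, H, W) for a given output resolution."""
    return (LATENT_CHANNELS, height // VAE_SCALE, width // VAE_SCALE)

print(latent_shape(1024, 1024))  # -> (4, 128, 128)
```

Decoding a 4x128x128 latent with a lightweight decoder is fast enough to run every few denoising steps, which is what makes live previews practical.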

πŸ› οΈ Quick Start

from optimum.intel.openvino import OVStableDiffusionXLPipeline
pipe = OVStableDiffusionXLPipeline.from_pretrained("CelesteImperia/SDXL-OpenVINO")