How to use keras-sd/diffusion-model-tflite with Keras:
```python
# Available backend options are: "jax", "torch", "tensorflow".
import os
os.environ["KERAS_BACKEND"] = "jax"

import keras

model = keras.saving.load_model("hf://keras-sd/diffusion-model-tflite")
```
This repository hosts the TFLite version of the diffusion model part of KerasCV Stable Diffusion.

Stable Diffusion consists of a text encoder, a diffusion model, a decoder, and some glue code to handle the inputs and outputs of each part. The TFLite model in this repository bundles not only the diffusion model itself but also the TensorFlow operations that take the context and unconditional context from the text encoder and generate a latent. The latent output should then be passed to the decoder, which is hosted in a separate repository.
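To illustrate the data flow described above, here is a minimal NumPy sketch with stand-in functions. The shapes (a 77×768 token context, a 64×64×4 latent for 512×512 images) and all function bodies are assumptions for illustration only; in the real pipeline these stages are the text-encoder, diffusion-model, and decoder models themselves.

```python
import numpy as np

# Assumed shapes for a 512x512 Stable Diffusion pipeline (illustrative only).
MAX_PROMPT_LENGTH, EMBED_DIM = 77, 768
LATENT_H, LATENT_W, LATENT_C = 64, 64, 4

def text_encoder(prompt):
    # Stand-in: the real text encoder returns one embedding per token.
    return np.zeros((1, MAX_PROMPT_LENGTH, EMBED_DIM), dtype=np.float32)

def diffusion_model(context, unconditional_context, num_steps=50, seed=0):
    # Stand-in for the TFLite model in this repository: takes both contexts
    # and iteratively denoises a random latent.
    rng = np.random.default_rng(seed)
    latent = rng.standard_normal((1, LATENT_H, LATENT_W, LATENT_C)).astype(np.float32)
    for _ in range(num_steps):
        latent = latent * 0.99  # placeholder for a real denoising step
    return latent

def decoder(latent):
    # Stand-in for the separately hosted decoder: latent -> RGB image.
    return np.zeros((1, LATENT_H * 8, LATENT_W * 8, 3), dtype=np.float32)

context = text_encoder("a photo of an astronaut")
unconditional_context = text_encoder("")
latent = diffusion_model(context, unconditional_context)
image = decoder(latent)
print(latent.shape, image.shape)  # (1, 64, 64, 4) (1, 512, 512, 3)
```

The point of the sketch is the hand-off: the diffusion model consumes both contexts and emits a latent, and only the decoder turns that latent into pixels.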
The TFLite conversion was based on the SavedModel from the keras-sd-serving repository, and TensorFlow version >= 2.12-nightly was used.
- NOTE: Dynamic range quantization was used.
- NOTE: TensorFlow versions < 2.12-nightly will fail during the conversion process.
- NOTE: For those who wonder how the SavedModel is constructed, see the keras-sd-serving repository.
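The conversion step described above can be sketched with the standard `tf.lite.TFLiteConverter` API, where setting `tf.lite.Optimize.DEFAULT` enables dynamic range quantization. The tiny `tf.Module` below is a self-contained stand-in so the sketch runs end to end; the real input was the Stable Diffusion SavedModel, not this module.

```python
import tempfile

import tensorflow as tf

class TinyModule(tf.Module):
    # Stand-in model so the conversion sketch is runnable; the real
    # conversion used the SavedModel from the keras-sd-serving repository.
    @tf.function(input_signature=[tf.TensorSpec([1, 8], tf.float32)])
    def __call__(self, x):
        return tf.matmul(x, tf.ones([8, 4]))

module = TinyModule()
saved_model_dir = tempfile.mkdtemp()
tf.saved_model.save(
    module, saved_model_dir,
    signatures=module.__call__.get_concrete_function(),
)

converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
# Dynamic range quantization: weights are stored in 8 bits while
# activations remain float at runtime.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

Dynamic range quantization is a reasonable default here because it shrinks the weight-heavy diffusion model without requiring a calibration dataset, at the cost of float activations at inference time.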