Part of the Torq Models Sources collection: compiler inputs used to generate models in the Torq Models collection.
This MobileNetV2 model is generated from tf.keras.applications
using tf_model_generator.py.
The calibration dataset for int8 quantization is built from random data.
Available formats: int16, int8, float32, float16
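The random-data int8 calibration mentioned above can be sketched with TensorFlow Lite's post-training quantization. The details below (sample count, input shape) are assumptions and not taken from tf_model_generator.py:

```python
import numpy as np
import tensorflow as tf

# A randomly initialized MobileNetV2 stands in for the exported model.
model = tf.keras.applications.MobileNetV2(weights=None)

def representative_dataset():
    # Yield random inputs shaped like the model input (1, 224, 224, 3).
    # A real calibration run would typically use many more samples.
    for _ in range(10):
        yield [np.random.rand(1, 224, 224, 3).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8
tflite_model = converter.convert()  # serialized int8 .tflite flatbuffer (bytes)
```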
The kaggle.tflite model is the original quantized model
published by Google; it uses v1 TFLite quantization.
```python
# Available backend options are: "jax", "torch", "tensorflow".
import os
os.environ["KERAS_BACKEND"] = "jax"

import keras

model = keras.saving.load_model("hf://Synaptics/MobileNetV2")
```
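Once loaded, the model can be called on a batch of 224x224 RGB images. As a minimal offline sketch, a randomly initialized MobileNetV2 stands in below for the hf:// checkpoint; in practice you would use the `model` loaded above:

```python
import numpy as np
import tensorflow as tf

# Stand-in for the checkpoint loaded from hf://Synaptics/MobileNetV2.
model = tf.keras.applications.MobileNetV2(weights=None)

# Random input with MobileNetV2's default input shape.
x = np.random.rand(1, 224, 224, 3).astype("float32")
preds = model(x).numpy()
print(preds.shape)  # (1, 1000) class scores
```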