The `.npy` files under this folder are extracted from the DeepLab V3 MobileNet V2 TFLite model generated by the following commands. The model is licensed under the Apache License 2.0.
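As a sketch of how such files are consumed, `.npy` files round-trip through NumPy with dtype and shape preserved. The filename below is hypothetical, not one of the actual files in this folder:

```python
import numpy as np

# Hypothetical example: the real filenames in this folder may differ.
# Save a small array the same way the extracted tensors are stored ...
arr = np.arange(6, dtype=np.float32).reshape(2, 3)
np.save("example_weights.npy", arr)

# ... and load it back; np.load restores the exact dtype and shape.
loaded = np.load("example_weights.npy")
print(loaded.shape, loaded.dtype)
```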
### How to Generate DeepLab V3 MobileNet V2 TFLite model
Download and extract the original [DeepLab V3 MobileNet V2 model](http://download.tensorflow.org/models/deeplabv3_mnv2_pascal_trainval_2018_01_29.tar.gz), then use the following tools and commands to convert the frozen graph to a TFLite model.
Use TensorFlow's [`optimize_for_inference`](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/tools/optimize_for_inference.py) tool to generate a stripped frozen graph:
```
python3 -m tensorflow.python.tools.optimize_for_inference \
--input=./frozen_inference_graph.pb \
--output=./frozen_inference_graph_stripped.pb \
--frozen_graph=True \
--input_names="sub_7" \
--output_names="ArgMax"
```
Use the [TensorFlow Lite converter](https://www.tensorflow.org/lite/convert) tool to convert the stripped frozen graph to a TFLite model:
```
tflite_convert \
--graph_def_file=frozen_inference_graph_stripped.pb \
--output_file=deeplab_mobilenetv2.tflite \
--output_format=TFLITE \
--input_format=TENSORFLOW_GRAPHDEF \
--input_arrays=sub_7 \
--output_arrays=ArgMax
```
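The converted model's `ArgMax` output is a per-pixel map of class indices. A minimal sketch of turning such a map into an RGB mask, assuming the standard 21-class PASCAL VOC label set this checkpoint was trained on (the 65×65 map size below is illustrative, not a claim about this model's exact output shape):

```python
import numpy as np

def pascal_colormap(num_classes=21):
    # Standard PASCAL VOC colormap: each class index is converted to an RGB
    # triple by scattering its bits across the three color channels.
    cmap = np.zeros((num_classes, 3), dtype=np.uint8)
    for i in range(num_classes):
        r = g = b = 0
        c = i
        for j in range(8):
            r |= ((c >> 0) & 1) << (7 - j)
            g |= ((c >> 1) & 1) << (7 - j)
            b |= ((c >> 2) & 1) << (7 - j)
            c >>= 3
        cmap[i] = (r, g, b)
    return cmap

# Illustrative stand-in for the model's ArgMax output: class indices per pixel.
seg = np.random.randint(0, 21, size=(65, 65))

# Index the colormap with the segmentation map to get a color mask.
rgb = pascal_colormap()[seg]  # shape (65, 65, 3), dtype uint8
print(rgb.shape)
```

Class 0 (background) maps to black and class 1 (aeroplane) to dark red, matching the conventional VOC palette.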