---
title: README
emoji: 🌖
colorFrom: yellow
colorTo: indigo
sdk: static
pinned: false
---
# LiteRT Community
[LiteRT](https://ai.google.dev/edge/litert) is Google's on-device framework for high-performance ML & GenAI deployment on edge platforms. It is the improved successor to TensorFlow Lite. On this community page, you can find ready-to-run LiteRT models for a wide range of ML/AI tasks.
Within this ecosystem, [LiteRT-LM](https://github.com/google-ai-edge/LiteRT-LM/blob/main/README.md) specializes in cutting-edge GenAI. Recognizing that LLMs now function as complex pipelines of related models rather than as single standalone models, LiteRT-LM builds on LiteRT to deliver an optimized solution for running LLMs on-device.
Both LiteRT and LiteRT-LM run on a variety of platforms, including Android, iOS, Windows, macOS, Linux, and the Web, allowing easy deployment and scaling across a diverse device landscape.
# Trying it Live
Not sure where to start? We recommend first trying our models in our [Google AI Edge Gallery app](https://play.google.com/store/apps/details?id=com.google.ai.edge.gallery&pli=1) on Android.
# Community Contributions
Are we missing your favorite model? You can convert and run [PyTorch](https://github.com/google-ai-edge/ai-edge-torch), [TensorFlow](https://ai.google.dev/edge/litert/models/convert_tf), or [JAX](https://ai.google.dev/edge/litert/models/convert_jax) models in the classic TFLite format using the LiteRT conversion and optimization tools. For LLMs, you can use the [LiteRT Torch Generative API](https://github.com/google-ai-edge/ai-edge-torch/tree/main/ai_edge_torch/generative). When your model is ready, join the LiteRT community org and upload the model here for others to try!
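As a rough illustration of the PyTorch path, the sketch below converts a model to the `.tflite` format with the `ai-edge-torch` package. It assumes `ai-edge-torch`, `torch`, and `torchvision` are installed; the specific model (`resnet18`), input shape, and output filename are placeholders — any `torch.nn.Module` with a matching sample input works the same way.

```python
# Sketch: convert a PyTorch model to the LiteRT/TFLite format
# using ai-edge-torch. Assumes `pip install ai-edge-torch`.
import torch
import torchvision
import ai_edge_torch

# Any torch.nn.Module can be converted; resnet18 is just an example.
model = torchvision.models.resnet18(weights=None).eval()
sample_inputs = (torch.randn(1, 3, 224, 224),)

# The sample inputs fix the converted model's input signature.
edge_model = ai_edge_torch.convert(model, sample_inputs)

# Optional sanity check: run the converted model in Python,
# then write the .tflite flatbuffer to disk for deployment.
output = edge_model(*sample_inputs)
edge_model.export("resnet18.tflite")
```

The exported `.tflite` file can then be loaded by the LiteRT runtime on-device, or uploaded to the community org for others to try.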