# OpenLipSync
This is a small repository containing everything required to run inference with LatentSync 1.5.
## Installation
- Clone the repo
- On Debian-based systems, run `bash debian_setup.sh`; this sets up both local and remote (Modal) inference
- For remote inference with Modal, first create the volume by running:
```bash
uv run modal run scripts/modal_download_extras.py
uv run modal run scripts/modal_download_models.py
```
## Running Inference
### Local Inference
Edit `inference.py` at the repository root to set the paths to your video and audio files, then run:
```bash
uv run inference.py
```
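What that edit might look like, as a minimal sketch (the variable names here are hypothetical; check `inference.py` for the actual ones):

```python
from pathlib import Path

# Hypothetical sketch -- the variable names in the real inference.py may differ.
video_path = Path("assets/demo_video.mp4")  # your input video (placeholder path)
audio_path = Path("assets/demo_audio.wav")  # your driving audio (placeholder path)


def check_inputs(video: Path, audio: Path) -> None:
    """Fail fast with a clear message if either input path is wrong."""
    for p in (video, audio):
        if not p.exists():
            raise FileNotFoundError(f"Set a valid path in inference.py: {p}")
```

Checking the paths up front avoids discovering a typo only after the model has loaded.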
### Remote Inference
Edit `modal_lipsync_inference.py` at the repository root to set the paths to your video and audio files, then run:
```bash
uv run modal run modal_lipsync_inference.py
```
### Remote Inference with FastAPI Endpoints
Run:
```bash
uv run modal run modal_lipsync_serve.py
```
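Once the app is up, Modal prints the base URL of the deployed endpoint. A hypothetical client sketch is below — the route name (`/inference`) and the multipart field names (`video`, `audio`) are assumptions, not taken from `modal_lipsync_serve.py`, so adjust them to match the actual FastAPI routes:

```python
import requests


def build_inference_request(base_url: str, video_path: str, audio_path: str) -> requests.Request:
    """Construct (but do not send) a multipart POST against the assumed /inference route."""
    return requests.Request(
        "POST",
        base_url.rstrip("/") + "/inference",  # assumed route name
        files={
            "video": open(video_path, "rb"),  # input video to lip-sync (assumed field name)
            "audio": open(audio_path, "rb"),  # driving audio track (assumed field name)
        },
    )


# To actually send it:
# resp = requests.Session().send(build_inference_request(url, "in.mp4", "in.wav").prepare())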
## TODO
- Add MuseTalk checkpoints
- Add LatentSync 1.6 checkpoints