# TimeChat: Docker Environment Setup (CUDA 12.1, Ubuntu 22.04, Python 3.10, PyTorch 2.1.2)

## Overview
This project provides a Dockerized environment to run the Python-based TimeChat application with GPU acceleration, using:
- CUDA 12.1
- Ubuntu 22.04
- Python 3.10
- PyTorch 2.1.2
Original repository:
https://github.com/RenShuhuai-Andy/TimeChat
## Prerequisites
- Docker and the NVIDIA Container Toolkit must be installed
- A Linux machine with a compatible NVIDIA GPU and drivers
- The `TimeChat` and `ckpt` directories must be present in the current working directory
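These prerequisites can be checked with a quick pre-flight script; `check_dirs` below is a hypothetical helper written for this guide, not part of TimeChat:

```shell
# Hypothetical pre-flight helper: verify the required directories exist
# before building or running the container.
check_dirs() {
  for d in "$@"; do
    if [ -d "$d" ]; then
      echo "found: $d"
    else
      echo "missing: $d" >&2
      return 1
    fi
  done
}

command -v docker >/dev/null || echo "docker not found on PATH" >&2
check_dirs TimeChat ckpt || echo "create/clone the missing directories first" >&2
```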
## Clone the Repository
Clone the required repository and initialize Git LFS:
```shell
git lfs install
git clone https://huggingface.co/Bio-sensing/video_captioning_ubicomp_student_challenge
```
Make sure the TimeChat directory is properly set up.
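If `git lfs install` was skipped before cloning, large checkpoint files may exist only as small LFS pointer files rather than real weights. The helper below (`find_lfs_pointers`, a hypothetical name for this guide) heuristically detects leftover pointers:

```shell
# Hypothetical helper: list files that are still Git LFS pointers.
# Pointer files begin with a "version https://git-lfs..." line.
find_lfs_pointers() {
  grep -rls "^version https://git-lfs" "$1" 2>/dev/null
}

pointers="$(find_lfs_pointers video_captioning_ubicomp_student_challenge)"
if [ -n "$pointers" ]; then
  echo "LFS pointers remain; run 'git lfs pull' inside the clone:"
  echo "$pointers"
fi
```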
## Build the Docker Image
Build the Docker image using the following command:
```shell
docker build -t cuda121_ubuntu2204_python310_torch212 -f Dockerfile_cuda121_ubuntu2204_python310_torch212 .
```
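The actual `Dockerfile_cuda121_ubuntu2204_python310_torch212` ships with the project and is authoritative. For orientation only, a minimal image matching the advertised stack might look like the sketch below; the base image tag and pinned package versions are assumptions, not the repository's exact contents:

```dockerfile
# Hypothetical sketch -- the repository's Dockerfile is authoritative.
FROM nvidia/cuda:12.1.0-cudnn8-devel-ubuntu22.04

# System packages: Python 3.10 (Ubuntu 22.04 default), git, ffmpeg for video I/O
RUN apt-get update && apt-get install -y --no-install-recommends \
        python3.10 python3-pip git ffmpeg && \
    rm -rf /var/lib/apt/lists/*

# PyTorch 2.1.2 built against CUDA 12.1, plus Jupyter Lab for the demo notebook
RUN pip3 install torch==2.1.2 torchvision==0.16.2 \
        --index-url https://download.pytorch.org/whl/cu121 && \
    pip3 install jupyterlab

WORKDIR /home
```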
## Run the Docker Container
Run the container with GPU access and volume mounting:
```shell
docker run -it --name "timechat" --gpus all \
  -v $(pwd)/TimeChat:/home/TimeChat \
  -v $(pwd)/ckpt:/home/ckpt/ \
  -p 12354:12354 -p 12355:12355 \
  cuda121_ubuntu2204_python310_torch212 bash
```
## Launch Jupyter Lab
Inside the container, execute the following:
```shell
cd /home/
jupyter-lab --ip 0.0.0.0 --port=12354 --allow-root --no-browser --ContentsManager.allow_hidden=True &
```
Then, open the following URL in your browser on the host machine:
http://localhost:12354
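Optionally, you can confirm from the host that the forwarded port is reachable before opening the browser. This sketch relies on bash's `/dev/tcp` feature (bash-only, not POSIX sh):

```shell
# Return 0 if a TCP connection to localhost:$1 succeeds (bash /dev/tcp).
check_port() {
  (exec 3<>"/dev/tcp/localhost/$1") 2>/dev/null
}

if check_port 12354; then
  echo "Jupyter is reachable at http://localhost:12354"
else
  echo "port 12354 not reachable yet (is jupyter-lab running?)"
fi
```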
You can verify the environment and functionality by opening and running `demo.ipynb`.