How to use MBZUAI/Video-ChatGPT-7B with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("visual-question-answering", model="MBZUAI/Video-ChatGPT-7B")
```
```python
# Load the model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("MBZUAI/Video-ChatGPT-7B", torch_dtype="auto")
```
Does anyone know how to successfully deploy this model to AWS or Hugging Face? None of the forks seem to work, so the project appears to be missing something. Any help?