How to use OpenGVLab/InternVideo2-Chat-8B with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "OpenGVLab/InternVideo2-Chat-8B",
    trust_remote_code=True,
    dtype="auto",
)
```
I ran into problems while cloning the repo (it is a gated model), and found a workaround: authenticate in the clone URL with a fine-grained token that has access to the gated repo.
```shell
git clone https://<user_name>:<finegrained_token_with_gated_repo_access>@huggingface.co/OpenGVLab/InternVideo2-Chat-8B
```
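If you'd rather not embed the token in the clone URL (it can end up in your shell history), a sketch of an alternative, assuming the `huggingface_hub` CLI and `git-lfs` are installed, is to log in once and let git reuse the stored credential:

```shell
# Log in once; paste a fine-grained token with access to the gated repo.
# When prompted, allow the CLI to add the token as a git credential.
huggingface-cli login

# Ensure a credential helper is configured so git can reuse the token
git config --global credential.helper store

# The model weights are tracked with Git LFS
git lfs install

# Clone without the token in the URL
git clone https://huggingface.co/OpenGVLab/InternVideo2-Chat-8B
```

This keeps the token out of the remote URL recorded in `.git/config`.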
Perhaps this can be updated here.