# NVILA-AWQ
License: apache-2.0

Here we provide AWQ-quantized versions of the most popular NVILA models. These checkpoints let you deploy the models with TinyChat and unlock the full potential of NVILA on your hardware.

A one-command demo to chat with quantized NVILA models via llm-awq (using NVILA-8B as an example):

```bash
cd llm-awq/tinychat
python nvila_demo.py --model-path PATH/TO/NVILA \
    --quant_path NVILA-AWQ/NVILA-8B-w4-g128-awq-v2.pt \
    --media PATH/TO/ANY/IMAGES/VIDEOS \
    --act_scale_path NVILA-AWQ/NVILA-8B-smooth-scale.pt \
    --all --chunk --model_type nvila
```
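The checkpoint name encodes the quantization configuration: `w4` means 4-bit weights and `g128` means a group size of 128, i.e. one scale and zero-point per 128 weights along the input channel. The sketch below illustrates the group-wise 4-bit idea only; it is not the llm-awq implementation, and the function name and shapes are made up for this example:

```python
import numpy as np

def quantize_groupwise(w, bits=4, group_size=128):
    """Illustrative group-wise quantization: one (scale, zero) pair per group.
    Not the actual llm-awq kernel, just the basic w4-g128 arithmetic."""
    qmax = 2**bits - 1                       # 15 for 4-bit
    w = w.reshape(-1, group_size)            # split weights into groups of 128
    lo = w.min(axis=1, keepdims=True)
    hi = w.max(axis=1, keepdims=True)
    scale = (hi - lo) / qmax                 # per-group quantization step
    zero = np.round(-lo / scale)             # per-group zero-point
    q = np.clip(np.round(w / scale + zero), 0, qmax)
    return q.astype(np.uint8), scale, zero

np.random.seed(0)
w = np.random.randn(1024).astype(np.float32)
q, scale, zero = quantize_groupwise(w)

# Dequantize and check the reconstruction error (bounded by scale / 2 per group).
w_hat = ((q.astype(np.float32) - zero) * scale).reshape(-1)
print(float(np.abs(w - w_hat).max()))  # prints the max reconstruction error
```

With 4-bit storage plus one scale/zero pair per 128 weights, the weight memory shrinks to roughly a quarter of FP16, which is what makes the quantized NVILA checkpoints above fit on smaller GPUs.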