Models from https://huggingface.co/nRuaif/Prototype-L2
Only the gptq-4bit-32g-actorder_True branch is supported for now.
In oobabooga (text-generation-webui), you can download it with the command: python download-model.py IHaBiS/Prototype-L2-GPTQ --branch gptq-4bit-32g-actorder_True