How to use Nutanix/CodeLlama-7b-Instruct-hf_cpp_unit_tests_lora_8_alpha_16_class_level with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "Nutanix/CodeLlama-7b-Instruct-hf_cpp_unit_tests_lora_8_alpha_16_class_level",
    dtype="auto",
)
```