Instructions for using genies-models/llama-7b-code_low_quality with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- PEFT
How to use genies-models/llama-7b-code_low_quality with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForSequenceClassification

base_model = AutoModelForSequenceClassification.from_pretrained("models/llama-7b")
model = PeftModel.from_pretrained(base_model, "genies-models/llama-7b-code_low_quality")
```
- Notebooks
- Google Colab
- Kaggle