Instructions for using genies-models/llama-7b-code_low_quality with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- PEFT
How to use genies-models/llama-7b-code_low_quality with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForSequenceClassification

base_model = AutoModelForSequenceClassification.from_pretrained("models/llama-7b")
model = PeftModel.from_pretrained(base_model, "genies-models/llama-7b-code_low_quality")
```

- Notebooks
- Google Colab
- Kaggle