Instructions for using AmrBelal021/CodeGuard-7B-v1 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- PEFT
How to use AmrBelal021/CodeGuard-7B-v1 with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the 4-bit quantized base model
base_model = AutoModelForCausalLM.from_pretrained(
    "unsloth/meta-llama-3.1-8b-instruct-bnb-4bit"
)

# Apply the CodeGuard LoRA adapter on top of the base model
model = PeftModel.from_pretrained(base_model, "AmrBelal021/CodeGuard-7B-v1")
```

- Notebooks
- Google Colab
- Kaggle
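The base model is a Llama 3.1 instruct checkpoint, so prompts should follow the Llama 3.1 chat format. Below is a minimal sketch of building such a prompt by hand; the special tokens are the standard Llama 3.1 chat markers, and the helper name `build_llama31_prompt` is illustrative. In practice, prefer `tokenizer.apply_chat_template()`, which constructs this string for you.

```python
# Sketch of the Llama 3.1 instruct prompt layout (illustrative helper,
# not part of this repository). Prefer tokenizer.apply_chat_template().

def build_llama31_prompt(user_message: str, system_message: str = "") -> str:
    parts = ["<|begin_of_text|>"]
    if system_message:
        # Optional system turn, terminated by the end-of-turn token
        parts.append(
            "<|start_header_id|>system<|end_header_id|>\n\n"
            + system_message + "<|eot_id|>"
        )
    # User turn
    parts.append(
        "<|start_header_id|>user<|end_header_id|>\n\n"
        + user_message + "<|eot_id|>"
    )
    # Leave the assistant header open so the model generates the reply
    parts.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(parts)

prompt = build_llama31_prompt("Review this function for security issues.")
```

The resulting string can be tokenized and passed to `model.generate()` on the adapter-loaded model from the PEFT snippet above.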
---
base_model: unsloth/meta-llama-3.1-8b-instruct-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
license: apache-2.0
language:
- en
---
Uploaded model
- Developed by: AmrBelal021
- License: apache-2.0
- Finetuned from model: unsloth/meta-llama-3.1-8b-instruct-bnb-4bit

This Llama model was trained 2x faster with Unsloth.
