Instructions for using PragmaticMachineLearning/llama2-coqa-instruct with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- PEFT
How to use PragmaticMachineLearning/llama2-coqa-instruct with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base Llama 2 checkpoint, then attach the CoQA instruction-tuned adapter on top of it.
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")
model = PeftModel.from_pretrained(base_model, "PragmaticMachineLearning/llama2-coqa-instruct")
```

A quick generation check using this `model` is sketched after the notebook links below.

- Notebooks
- Google Colab
- Kaggle
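As a quick check that the adapter pairs correctly with the base model, here is a minimal generation sketch. The tokenizer source, the prompt text, and the generation settings are illustrative assumptions rather than part of the model card; `model` is the PeftModel built in the PEFT snippet above.

```python
from transformers import AutoTokenizer

# Assumption: the adapter repo does not ship its own tokenizer, so use the base model's.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

# Illustrative CoQA-style prompt; the exact prompt format expected by the adapter is an assumption.
prompt = "Passage: The Eiffel Tower is in Paris.\nQuestion: Where is the Eiffel Tower?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")

# `model` is the PeftModel created above (base model + adapter).
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```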
Commit e4d5385 (parent: ead0471): Update adapter_config.json
Files changed: adapter_config.json (+1 -1)
adapter_config.json CHANGED

```diff
@@ -1,6 +1,6 @@
 {
   "auto_mapping": null,
-  "base_model_name_or_path": "meta-llama/Llama-2-7b",
+  "base_model_name_or_path": "meta-llama/Llama-2-7b-hf",
   "bias": "none",
   "fan_in_fan_out": false,
   "inference_mode": true,
```
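This one-line change matters because `base_model_name_or_path` is the field PEFT consults when resolving the base checkpoint for the adapter; the added `-hf` suffix points it at the transformers-format Llama 2 repo. A minimal sketch for inspecting the field as published on the Hub (assuming the adapter repo is accessible):

```python
from peft import PeftConfig

# Read adapter_config.json from the Hub and check which base checkpoint it references.
config = PeftConfig.from_pretrained("PragmaticMachineLearning/llama2-coqa-instruct")
print(config.base_model_name_or_path)  # expected: "meta-llama/Llama-2-7b-hf" after this commit
```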