How to use chansung/coding_llamaduo_result2 with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model first, then attach the fine-tuned adapter weights on top of it
base_model = AutoModelForCausalLM.from_pretrained("google/gemma-7b")
model = PeftModel.from_pretrained(base_model, "chansung/coding_llamaduo_result2")
```

This model is a fine-tuned version of google/gemma-7b on the chansung/merged_ds_coding dataset. It achieves the following results on the evaluation set:
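The PEFT loading step above attaches low-rank (LoRA-style) adapter matrices to the base model: conceptually, each adapted weight matrix W is replaced by W + (alpha / r) · B · A, where A and B are the small trained matrices stored in the adapter repository. A minimal pure-Python sketch of that update, with illustrative shapes and values that are not taken from this checkpoint:

```python
# Conceptual sketch of how a low-rank adapter modifies a base weight matrix.
# Shapes and values are illustrative; real Gemma-7B matrices are far larger.

def matmul(a, b):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def apply_lora(w, lora_a, lora_b, alpha, r):
    """Return the effective weight W + (alpha / r) * B @ A."""
    delta = matmul(lora_b, lora_a)  # (out, r) @ (r, in) -> (out, in)
    scale = alpha / r
    return [[w[i][j] + scale * delta[i][j] for j in range(len(w[0]))]
            for i in range(len(w))]

# 2x2 identity base weight with a rank-1 adapter
w = [[1.0, 0.0], [0.0, 1.0]]
lora_a = [[1.0, 2.0]]       # A: (r=1, in=2)
lora_b = [[0.5], [0.25]]    # B: (out=2, r=1)

effective = apply_lora(w, lora_a, lora_b, alpha=2, r=1)
print(effective)  # [[2.0, 2.0], [0.5, 2.0]]
```

Because only A and B are trained, the adapter repository is tiny compared with the 7B base checkpoint, which is why the base model and the adapter are downloaded and loaded separately.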
- Loss: 1.2247
The following results were recorded during training:
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 0.8192 | 1.0 | 122 | 1.2006 |
| 0.6377 | 2.0 | 245 | 1.1304 |
| 0.5334 | 3.0 | 367 | 1.1456 |
| 0.4454 | 4.0 | 490 | 1.1935 |
| 0.408 | 4.98 | 610 | 1.2247 |