Commit 15865ff (parent 2f3694a): Update README.md
)
```

Now load the QLoRA adapter from the appropriate checkpoint directory:
```
from peft import PeftModel

# Attach the trained QLoRA adapter weights to the quantized base model
ft_model = PeftModel.from_pretrained(base_model, "mistral-viggo-finetune/checkpoint-950")
```
Let's try the same `eval_prompt`, and thus the same `model_input`, as above and see whether the fine-tuned model performs better:

```
eval_prompt = """Given a target sentence construct the underlying meaning representation of the input sentence as a single function with attributes and attribute values.
This function should describe the target string accurately and the function must be one of the following ['inform', 'request', 'give_opinion', 'confirm', 'verify_attribute', 'suggest', 'request_explanation', 'recommend', 'request_attribute'].
The attributes must be one of the following: ['name', 'exp_release_date', 'release_year', 'developer', 'esrb', 'rating', 'genres', 'player_perspective', 'has_multiplayer', 'platforms', 'available_on_steam', 'has_linux_release', 'has_mac_release', 'specifier']

### Target sentence:
Earlier, you stated that you didn't have strong feelings about PlayStation's Little Big Adventure. Is your opinion true for all games which don't have multiplayer?

### Meaning representation:
"""

# Tokenize the evaluation prompt with the same tokenizer used for decoding below
model_input = eval_tokenizer(eval_prompt, return_tensors="pt").to("cuda")

ft_model.eval()
with torch.no_grad():
    print(eval_tokenizer.decode(ft_model.generate(**model_input, max_new_tokens=100)[0], skip_special_tokens=True))
```
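Because the prompt constrains the model to one of nine function names, a lightweight post-processing check can confirm that a generated meaning representation is at least well-formed. The helper below is an illustrative sketch, not part of the original pipeline; the `example` string stands in for a hypothetical model output.

```
import re

# Allowed ViGGO dialogue-act functions, as listed in the prompt above
ALLOWED_FUNCTIONS = {
    'inform', 'request', 'give_opinion', 'confirm', 'verify_attribute',
    'suggest', 'request_explanation', 'recommend', 'request_attribute',
}

def parse_function_name(mr):
    """Return the leading function name of a meaning representation
    such as "verify_attribute(name[...], ...)", or None if malformed."""
    match = re.match(r"\s*([a-z_]+)\s*\(", mr)
    return match.group(1) if match else None

def is_valid_mr(mr):
    return parse_function_name(mr) in ALLOWED_FUNCTIONS

# Hypothetical model output for the eval prompt above
example = "verify_attribute(name[Little Big Adventure], has_multiplayer[no], platforms[PlayStation])"
print(parse_function_name(example))  # verify_attribute
print(is_valid_mr(example))          # True
```

A check like this is useful when scoring many generations at once, since a malformed function name can be flagged before any attribute-level comparison.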