redscroll committed
Commit 00713fd · verified · 1 Parent(s): 6aa1e1c

Create README.md

Files changed (1): README.md +22 -0
README.md ADDED
---
pipeline_tag: text-generation
---
Not my model (obviously); I downloaded the Mistral release from https://models.mistralcdn.com/mistral-7b-v0-2/mistral-7B-v0.2.tar and uploaded it here for my own sanity (and fine-tuning), since it is still not available on the Mistral repo.
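For reference, the download-and-extract step described above can be sketched as follows. This is a minimal sketch using only the standard library; the local paths are assumptions, and the usage lines are commented out because the tarball is very large:

```python
import tarfile
import urllib.request
from pathlib import Path

# URL from the Mistral release mentioned above.
MISTRAL_URL = "https://models.mistralcdn.com/mistral-7b-v0-2/mistral-7B-v0.2.tar"

def fetch(url: str, dest: Path) -> Path:
    """Download the release tarball to `dest`, skipping if it already exists."""
    if not dest.exists():
        urllib.request.urlretrieve(url, dest)
    return dest

def extract(tar_path: Path, out_dir: Path) -> list[str]:
    """Extract the tarball into `out_dir` and return the member names."""
    out_dir.mkdir(parents=True, exist_ok=True)
    with tarfile.open(tar_path) as tar:
        tar.extractall(out_dir)  # on Python 3.12+ consider filter="data"
        return tar.getnames()

# Usage (not run here; the archive is many gigabytes):
# weights = fetch(MISTRAL_URL, Path("mistral-7B-v0.2.tar"))
# print(extract(weights, Path("mistral-7B-v0.2")))
```

The extracted directory can then be uploaded to the Hub as-is.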

The standard `transformers` loading code works:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Load in bfloat16 and let accelerate place the weights across available devices.
model = AutoModelForCausalLM.from_pretrained(
    "redscroll/Mistral-7B-v0.2", torch_dtype=torch.bfloat16, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("redscroll/Mistral-7B-v0.2")

input_text = "In my younger and more vulnerable years"

# Tokenize and move the inputs to the same device as the model.
input_ids = tokenizer(input_text, return_tensors="pt").to(model.device)

outputs = model.generate(**input_ids, max_new_tokens=500)

print(tokenizer.decode(outputs[0]))
```