arnavgrg committed on
Commit 70ff731 · 1 Parent(s): 172cd26

Update README.md

Files changed (1):
  README.md +5 -1
README.md CHANGED
@@ -15,5 +15,9 @@ To use this model, you can just load it via `transformers` in fp16:
 import torch
 from transformers import AutoModelForCausalLM
 
-model = AutoModelForCausalLM.from_pretrained("arnavgrg/codallama-7b-instruct-nf4-fp16-upscaled", torch_dtype=torch.float16)
+model = AutoModelForCausalLM.from_pretrained(
+    "arnavgrg/codallama-7b-instruct-nf4-fp16-upscaled",
+    device_map="auto",
+    torch_dtype=torch.float16
+)
 ```