Files changed (1)
  1. README.md +19 -2
README.md CHANGED
@@ -38,12 +38,29 @@ ZAYA1-base performs extremely competitively against other base models of a simil
 
 ### Prerequisites
 
- TODO instructions to use our branch
+ To use ZAYA1, install the `zaya` branch of our fork of the `transformers` library, which is based on `transformers` v4.57.1:
+ ```bash
+ pip install "transformers @ git+https://github.com/Zyphra/transformers.git@zaya"
+ ```
 
+ The command above assumes the dependencies of `transformers` v4.57.1 are already installed in your environment. If you are installing into a fresh Python environment, you may want to specify an extra such as `[dev-torch]` to install all the dependencies:
+ ```bash
+ pip install "transformers[dev-torch] @ git+https://github.com/Zyphra/transformers.git@zaya"
+ ```
 
 
 
 ### Inference
 
- TODO
+ ```python
+ import torch
+ from transformers import AutoTokenizer, AutoModelForCausalLM
+ 
+ tokenizer = AutoTokenizer.from_pretrained("Zyphra/ZAYA1-base")
+ model = AutoModelForCausalLM.from_pretrained("Zyphra/ZAYA1-base", device_map="cuda", dtype=torch.bfloat16)
+ 
+ input_text = "What factors contributed to the fall of the Roman Empire?"
+ input_ids = tokenizer(input_text, return_tensors="pt").to("cuda")
+ 
+ outputs = model.generate(**input_ids, max_new_tokens=100)
+ print(tokenizer.decode(outputs[0]))
+ ```
- 
- 