vpj committed
Commit fa77955
Parents: 2a1c205, 6213f0f

Merge branch 'main' of https://huggingface.co/GeoV/GeoV-9b into main

Files changed (1):
  1. README.md (+27 -18)

README.md CHANGED
@@ -8,13 +8,13 @@ license: bigscience-openrail-m
 ---
 
 
-[GeoV](https://huggingface.co/docs/transformers/model_doc/geov)-9B is a 9 billion parameter autoregressive language model.
 
 The GeoV model was designed by Georges Harik and uses
-[Rotary Positional Embeddings with Relative distances (RoPER)](http://research.labml.ai/RoPER.html)
-by [Georges Hark](https://twitter.com/ghark) and [Varuna Jayasiri](https://twitter.com/vpj).
 
-[RoPER]((http://research.labml.ai/RoPER.html),
 in addition to using relative positions in the attention score calculation by RoPE embeddings,
 adds relative positional information explicitly to value embeddings.
 Specifically, it incorporates the relative positions of the tokens paid attention to.
@@ -36,29 +36,38 @@ RoPER has given better performance in some algorithmic tasks, and seems comparab
 | n<sub>heads</sub> | 40 |
 | d<sub>head</sub> | 128 |
 | n<sub>vocab</sub> | 65500 |
-| Sequence Length | 2049 |
 </figure>
 
 
 ## Generation
 
-The `generate()` method can be used to generate text using GeoV model.
 
 ```python
->>> from transformers import GeoVForCausalLM, GeoVTokenizer
 
->>> model = GeoVForCausalLM.from_pretrained("GeoV/GeoV-9b")
->>> tokenizer = GeoVTokenizer.from_pretrained("GeoV/GeoV-9b")
 
->>> prompt = "In mathematics, topology is the study of"
 
->>> input_ids = tokenizer(prompt, return_tensors="pt").input_ids
 
->>> gen_tokens = model.generate(
-...     input_ids,
-...     do_sample=True,
-...     temperature=0.9,
-...     max_length=100,
-... )
->>> gen_text = tokenizer.batch_decode(gen_tokens)[0]
 ```
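As a quick sanity check on the configuration table above, the head layout pins down the model width under the common convention that the hidden size is n<sub>heads</sub> × d<sub>head</sub>. This is a back-of-the-envelope sketch only; that convention, and the layer count, are not stated in the table excerpt:

```python
# Figures from the model-card table; d_model is an assumption
# (the usual n_heads * d_head convention), not a documented value.
n_heads = 40
d_head = 128
n_vocab = 65500

d_model = n_heads * d_head            # assumed model width: 5120
embedding_params = n_vocab * d_model  # input embedding matrix alone
print(d_model, embedding_params)      # 5120 335360000
```

The embedding matrix alone accounts for roughly 0.34B of the 9B parameters under this assumption; the rest sits in the (unlisted) transformer layers.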
 
@@ -8,13 +8,13 @@ license: bigscience-openrail-m
 ---
 
 
+[GeoV](https://github.com/geov-ai/geov)-9B is a 9 billion parameter causal language model.
 
 The GeoV model was designed by Georges Harik and uses
+[Rotary Positional Embeddings with Relative distances (RoPER)](https://research.labml.ai/RoPER.html)
+by [Georges Harik](https://twitter.com/gharik) and [Varuna Jayasiri](https://twitter.com/vpj).
 
+[RoPER](https://research.labml.ai/RoPER.html),
 in addition to using relative positions in the attention score calculation by RoPE embeddings,
 adds relative positional information explicitly to value embeddings.
 Specifically, it incorporates the relative positions of the tokens paid attention to.
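As an aside, the value-rotation idea described above can be sketched in plain Python. Everything here (2-D vectors, a single rotation frequency, the function names) is illustrative, not the actual GeoV implementation: because rotations compose, rotating each value by its own position, taking the attention-weighted sum, and counter-rotating by the query position is equivalent to weighting values rotated by their relative distance j − i.

```python
import math

def rotate(v, theta):
    """Rotate a 2-D vector by angle theta (one RoPE frequency pair)."""
    x, y = v
    return (x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta))

def roper_values(attn, values, i, freq=0.1):
    """Illustrative RoPER combination for query position i:
    rotate value j by j*freq, attention-weight, then rotate the
    sum back by -i*freq. Linearity of rotation makes this equal to
    sum_j attn[j] * rotate(values[j], (j - i) * freq)."""
    sx = sy = 0.0
    for j, (a, v) in enumerate(zip(attn, values)):
        rx, ry = rotate(v, j * freq)
        sx += a * rx
        sy += a * ry
    return rotate((sx, sy), -i * freq)

# Verify the relative-position property directly:
attn = [0.2, 0.5, 0.3]                       # attention weights, sum to 1
values = [(1.0, 0.0), (0.0, 1.0), (0.5, 0.5)]
i = 1                                        # query position
out = roper_values(attn, values, i)
ref = [0.0, 0.0]
for j, (a, v) in enumerate(zip(attn, values)):
    rx, ry = rotate(v, (j - i) * 0.1)        # value rotated by j - i
    ref[0] += a * rx
    ref[1] += a * ry
assert abs(out[0] - ref[0]) < 1e-9 and abs(out[1] - ref[1]) < 1e-9
```

The check confirms that the output carries each attended value rotated by its distance to the query, which is the relative positional information RoPER adds to the value path.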
 
@@ -36,29 +36,38 @@ RoPER has given better performance in some algorithmic tasks, and seems comparab
 | n<sub>heads</sub> | 40 |
 | d<sub>head</sub> | 128 |
 | n<sub>vocab</sub> | 65500 |
+| Sequence Length | 2048 |
 </figure>
 
+The released weights were trained on ~70 billion tokens.
+We plan to continue training up to 300 billion tokens and update the weights every 20 billion tokens.
+This training run is monolingual and uses the C4 (en) and English Wikipedia datasets.
+
+## Installation
+
+```shell
+pip install geov
+```
 
 ## Generation
 
+[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/geov-ai/geov/blob/master/notebooks/generate.ipynb)
 
 ```python
+from geov import GeoVForCausalLM, GeoVTokenizer
 
+model = GeoVForCausalLM.from_pretrained("GeoV/GeoV-9b")
+tokenizer = GeoVTokenizer.from_pretrained("GeoV/GeoV-9b")
 
+prompt = "In mathematics, topology is the study of"
 
+input_ids = tokenizer(prompt, return_tensors="pt").input_ids
 
+gen_tokens = model.generate(
+    input_ids,
+    do_sample=True,
+    temperature=0.9,
+    max_length=100,
+)
+gen_text = tokenizer.batch_decode(gen_tokens)[0]
 ```
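The `do_sample=True, temperature=0.9` arguments in the snippet above select sampling-based decoding. What temperature does can be shown with a generic, self-contained sketch (this is not the `generate()` internals, just the standard scaled-softmax sampling it is based on):

```python
import math, random

def sample_next(logits, temperature=0.9, rng=random):
    """Softmax over temperature-scaled logits, then draw an index."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                          # subtract max for stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()
    acc = 0.0
    for i, p in enumerate(probs):            # inverse-CDF draw
        acc += p
        if r < acc:
            return i, probs
    return len(probs) - 1, probs

logits = [2.0, 1.0, 0.1]                     # toy next-token logits
idx, probs = sample_next(logits, temperature=0.9)
```

Lowering the temperature sharpens the distribution toward the argmax (approaching greedy decoding as it goes to zero), while raising it flattens the distribution and increases diversity.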