Update README.md
Not Keras anymore; we use Transformers now.
README.md CHANGED

```diff
@@ -37,7 +37,7 @@ Models are trained on a context length of 8192 tokens, which is equivalent to Ge
 
 ### Usage
 
-Below we share some code snippets on how to get quickly started with running the model. First make sure to `pip install -U
+Below we share some code snippets on how to get quickly started with running the model. First make sure to `pip install -U transformers`, then copy the snippet from the section that is relevant for your usecase.
 
 #### Running the model with transformers
 
 [](https://colab.research.google.com/github/deveworld/Gemago/blob/main/Gemago_2b_Infer.ipynb)
```
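The updated README line describes the `transformers` workflow (install the library, then load and run the model). A minimal sketch of that workflow is below; note the repo id `deveworld/gemago-2b` is an assumption based on the GitHub URL above, not confirmed by this commit — check the model card for the real id.

```python
# Sketch of running a causal LM with Hugging Face transformers,
# as described in the updated README. Assumes `pip install -U transformers`.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "deveworld/gemago-2b"  # hypothetical repo id; verify on the Hub


def generate(prompt: str, model, tokenizer, max_new_tokens: int = 64) -> str:
    """Tokenize a prompt, generate a continuation, and decode it back to text."""
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Model download happens here, so it is kept out of import time.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    print(generate("Hello, ", model, tokenizer))
```

The linked Colab notebook (`Gemago_2b_Infer.ipynb`) presumably shows the author's actual inference code; this sketch only illustrates the generic `AutoTokenizer`/`AutoModelForCausalLM` pattern the README points to.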