README.md

Our model hasn't been fine-tuned through reinforcement learning from human feedback.

## How to Use

Phi-2 has been integrated into `transformers` since version 4.37.0; please make sure you are using version 4.37.0 or higher.

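As a minimal sketch of how such a version requirement can be checked, the snippet below compares a `major.minor.patch` string against the 4.37.0 minimum; the helper names (`parse_version`, `supports_phi2`) are illustrative and not part of the `transformers` API:

```python
# Hypothetical version guard (helper names are illustrative): Phi-2 support
# landed in transformers 4.37.0, so older installs should be rejected before
# attempting to load the model.
def parse_version(v: str) -> tuple:
    """Parse a 'major.minor.patch' string such as '4.37.0' into an int tuple."""
    return tuple(int(part) for part in v.split(".")[:3])

MIN_TRANSFORMERS = "4.37.0"

def supports_phi2(installed_version: str) -> bool:
    """Return True when the installed transformers version can run Phi-2."""
    return parse_version(installed_version) >= parse_version(MIN_TRANSFORMERS)

print(supports_phi2("4.36.2"))  # False: predates Phi-2 integration
print(supports_phi2("4.37.0"))  # True: first supported release
```

In practice you would pass `transformers.__version__` to such a check, or simply pin the dependency in your environment.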
Phi-2 is known to have an attention overflow issue when running in FP16. If you face this issue, please enable/disable autocast on the [PhiAttention.forward()](https://github.com/huggingface/transformers/blob/main/src/transformers/models/phi/modeling_phi.py#L306) function.
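The overflow itself is easy to reproduce outside the model; the NumPy sketch below (illustrative values, not Phi-2 internals) shows why large attention logits can blow up in FP16 while FP32 handles them fine:

```python
import numpy as np

# Attention logits are dot products that can exceed float16's maximum finite
# value (~65504) and become inf, which turns into NaN after softmax.
# Computing in FP32 instead (what toggling autocast around
# PhiAttention.forward() achieves) keeps the same values finite.
logits_fp32 = np.array([72000.0, 4800.0, 3600.0], dtype=np.float32)
logits_fp16 = logits_fp32.astype(np.float16)

print(bool(np.isinf(logits_fp16).any()))  # True: 72000 overflows float16
print(bool(np.isinf(logits_fp32).any()))  # False: float32 represents it fine
```
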
## Intended Uses