Merge pull request #61 from NanoCode012/feat/update-readme
README.md

````diff
@@ -33,12 +33,12 @@ pip3 install -e .[int4]
 
 accelerate config
 
-# finetune
-accelerate launch scripts/finetune.py examples/
+# finetune lora
+accelerate launch scripts/finetune.py examples/lora-openllama-3b/config.yml
 
 # inference
-accelerate launch scripts/finetune.py examples/
-    --inference --lora_model_dir="./
+accelerate launch scripts/finetune.py examples/lora-openllama-3b/config.yml \
+    --inference --lora_model_dir="./lora-out"
 ```
 
 ## Installation
@@ -199,8 +199,7 @@ datasets:
     # The type of prompt to use for training. [alpaca, sharegpt, gpteacher, oasst, reflection]
     type: alpaca # format OR format:prompt_style (chat/instruct)
     data_files: # path to source data files
-    shards: #
-    shards: # number of shards to split dataset into
+    shards: # number of shards to split data into
 
   # axolotl attempts to save the dataset as an arrow after packing the data together so
   # subsequent training attempts load faster, relative path
@@ -326,6 +325,9 @@ debug:
 
 # Seed
 seed:
+
+# Allow overwrite yml config using from cli
+strict:
 ```
 
 </details>
@@ -382,6 +384,10 @@ Please reduce any below
 
 Try set `fp16: true`
 
+> NotImplementedError: No operator found for `memory_efficient_attention_forward` ...
+
+Try to turn off xformers.
+
 ## Need help? 🙋♂️
 
 Join our [Discord server](https://discord.gg/HhrNrHJPRb) where we can help you
````
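For context, the `shards` and `strict` keys touched by this PR slot into an axolotl YAML config roughly as sketched below. This is a minimal illustrative fragment, not a complete or validated config: the dataset path and seed value are hypothetical placeholders, and only the keys shown in the diff are taken from the source.

```yaml
datasets:
  - path: data/my_data.jsonl  # hypothetical local file, for illustration only
    # The type of prompt to use for training. [alpaca, sharegpt, gpteacher, oasst, reflection]
    type: alpaca              # format OR format:prompt_style (chat/instruct)
    data_files:               # path to source data files
    shards:                   # number of shards to split data into

# Seed
seed: 42                      # placeholder value

# Allow overwrite yml config using from cli
strict:
```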