Commit 880b48d
Parent(s): 5e15ece

Update README.md
README.md CHANGED
@@ -69,7 +69,8 @@ The attention mechanism in a transformer model is designed to capture global dep
 '''
 ```
 
-
+## Finetuning details
+The finetuning scripts will be available in our [RAIL Github Repository](https://github.com/vmware-labs/research-and-development-artificial-intelligence-lab/tree/main/instruction-tuning)
 ## Evaluation
 
 <B>TODO</B>