Update README.md
Please check the [official repository](https://github.com/microsoft/DeBERTa) for more details and updates.
This is the DeBERTa V2 xlarge model with 24 layers and a hidden size of 1536. It has 900M total parameters and was trained on 160GB of data.
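The architecture figures above can be checked directly from the published checkpoint config; a minimal sketch using the Hugging Face `transformers` library, assuming the model is available under the `microsoft/deberta-v2-xlarge` id:

```python
from transformers import AutoConfig

model_id = "microsoft/deberta-v2-xlarge"  # assumed Hugging Face model id

# The config alone confirms the architecture described above,
# without downloading the ~900M-parameter weights.
config = AutoConfig.from_pretrained(model_id)
print(config.num_hidden_layers)  # number of transformer layers
print(config.hidden_size)        # hidden size per token
```

Swapping `AutoConfig` for `AutoModel.from_pretrained(model_id)` loads the full weights for feature extraction or fine-tuning.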
#### Fine-tuning on NLU tasks