Access to [git](https://github.com/yarongef/DistilProtBert)

DistilProtBert was pretrained on millions of protein sequences.

Differences between the DistilProtBert and ProtBert models:

| **Model** | **# of Parameters** | **# of Hidden layers** | **# of Pretraining sequences** | **Pretraining hardware** |
|:--------------:|:--------------:|:-----------------:|:-------------------------:|:------------------------:|
| ProtBert | 420M | 30 | 216M | 512 16GB TPUs |
| DistilProtBert | 230M | 15 | 43M | 5 v100 32GB GPUs |
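
The compression behind the table can be stated plainly: DistilProtBert keeps half of ProtBert's hidden layers and roughly 55% of its parameters. A quick sketch checking those ratios (all figures copied from the table above):

```python
# Figures copied from the comparison table above (M = millions).
protbert = {"params_M": 420, "layers": 30, "seqs_M": 216}
distilprotbert = {"params_M": 230, "layers": 15, "seqs_M": 43}

# Relative reductions achieved by distillation.
param_cut = 1 - distilprotbert["params_M"] / protbert["params_M"]
layer_cut = 1 - distilprotbert["layers"] / protbert["layers"]

print(f"parameters reduced by ~{param_cut:.0%}")   # ~45%
print(f"hidden layers reduced by {layer_cut:.0%}")  # 50%
```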
## Intended uses & limitations
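
As a ProtBert distillation, DistilProtBert is assumed here to use the ProtBert input format: uppercase amino acids separated by single spaces, with the rare residues U, Z, O, and B mapped to X. A minimal preprocessing sketch under that assumption (verify the exact convention against the repository linked above):

```python
import re

def preprocess(sequence: str) -> str:
    # ProtBert-family input convention (assumed to apply to DistilProtBert):
    # uppercase residues, rare amino acids U/Z/O/B mapped to X,
    # and a single space between residues.
    sequence = sequence.upper()
    sequence = re.sub(r"[UZOB]", "X", sequence)
    return " ".join(sequence)

print(preprocess("MKTUVL"))  # M K T X V L
```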