Update README.md
README.md
@@ -73,7 +73,7 @@ See below for information about the models' performance in different in-silico tests

## **Model description**

REXzyme is based on the [Efficient T5 Large Transformer](https://huggingface.co/google/t5-efficient-large) architecture (which in turn is very similar to the current version of Google Translate)
-and contains 48 (24
+and contains 48 (24 encoder / 24 decoder) layers with a model dimensionality of 1024, totaling 737.72 million parameters.
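
As a quick check of those figures, the sketch below (assuming a Python environment with `transformers` and `torch` installed) loads the linked base checkpoint, `google/t5-efficient-large`, and prints its layer counts, model dimensionality, and total parameter count, which should match the 737.72 million quoted above:

```python
# Sketch: verify the layer counts and parameter total of the base architecture.
# Downloads the public google/t5-efficient-large checkpoint linked above
# (the base architecture, not REXzyme itself).
from transformers import T5ForConditionalGeneration

model = T5ForConditionalGeneration.from_pretrained("google/t5-efficient-large")

print(model.config.num_layers)          # 24 encoder layers
print(model.config.num_decoder_layers)  # 24 decoder layers
print(model.config.d_model)             # 1024

n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.2f}M parameters")  # ~737.72M
```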

REXzyme is a translation machine trained on a portion of the RHEA database containing 31,970,152 reaction-enzyme pairs.
The pre-training was done on pairs of SMILES and amino acid sequences, tokenized with a char-level tokenizer.
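
Because the model follows the standard T5 sequence-to-sequence interface, inference should look like ordinary machine translation: encode a reaction SMILES, decode an amino acid sequence. The sketch below is only an illustration under stated assumptions: the checkpoint ID `user/REXzyme` is a hypothetical placeholder, and the exact reaction-SMILES input format the model expects is not specified in this excerpt.

```python
# Minimal usage sketch (hypothetical checkpoint ID and input format):
# translate a reaction, given as SMILES, into a candidate enzyme sequence.
from transformers import AutoTokenizer, T5ForConditionalGeneration

checkpoint = "user/REXzyme"  # placeholder -- substitute the real model ID
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = T5ForConditionalGeneration.from_pretrained(checkpoint)

# A reaction SMILES (reactants >> products), tokenized at character level.
reaction = "CC(=O)O.O>>CC(=O)[O-].[OH3+]"
inputs = tokenizer(reaction, return_tensors="pt")

# Sample an amino acid sequence from the decoder.
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```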