# ProGemma2

This is a custom configuration of Google's Gemma 2 LLM (335M parameters) that is being pre-trained on amino acid sequences of 512 AA or fewer in length. Periodic updates are made to this page as training reaches new checkpoints.

The purpose of this model was to investigate the differences between ProGemma and ProtGPT (a GPT-2 architecture) as they pertain to sequence generation.

Controlled generation is not a capability of this model. Adding it would be a way to significantly improve generation, since, in principle, a sequence that performs a given function or resides in a particular cellular location could then be generated.
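
To illustrate the 512-AA pre-training window mentioned above, a minimal sketch of how a corpus might be filtered to that length is shown below. This is a hypothetical helper for illustration only, not the actual training pipeline, and the sequence data is made up:

```python
# Hypothetical sketch: keep only amino acid sequences within the
# 512-AA pre-training window. Illustrative only; not the real pipeline.
MAX_LEN = 512  # maximum sequence length in amino acids


def filter_sequences(sequences):
    """Return only sequences of MAX_LEN amino acids or fewer."""
    return [seq for seq in sequences if len(seq) <= MAX_LEN]


# Example: a short peptide, an over-length sequence, another short one.
corpus = ["MKTAYIAKQR", "M" * 600, "GAVLIPFMWST"]
filtered = filter_sequences(corpus)
print(len(filtered))  # the 600-AA sequence is dropped
```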