JuIm committed · verified · Commit 6ceb66d · 1 Parent(s): d1e3bb7

Update README.md

Files changed (1): README.md +2 −2
README.md CHANGED

@@ -12,11 +12,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 # ProGemma
 
-This is a custom configuration of Google's Gemma 2 model that is being pre-trained on amino acid sequences of lengths 0 to 512. I used the free version of Google Colab to train this model, so updates are made regularly as the model hits new checkpoints. As of 08.05.2024, the model has been trained on about 20% of the dataset.
+This is a custom configuration of Google's Gemma 2 model that is being pre-trained on amino acid sequences of lengths 0 to 512. I used the free version of Google Colab to train this model, so updates are made regularly as the model hits new checkpoints. As of 08.08.2024, the model has been trained on about 25% of the dataset.
 
 The model generates amino acids on a letter-by-letter basis.
 
-Current training loss is about 2.7. Preliminary evaluation of generated sequences on AlphaFold 3 shows pTM scores of ~0.4 and average pLDDT scores of ~60. After training is complete, a proper evaluation will be done to see whether the sequences yield proteins with low free energy. Perplexity scores will also be calculated.
+Current training loss is about 2.7. Preliminary evaluation of generated sequences on AlphaFold 3 shows pTM scores of ~0.4 and average pLDDT scores of ~60. After training is complete, a proper evaluation will be done to see whether the sequences yield proteins with low free energy. Perplexity scores are on par with NVIDIA's ProtGPT2 for a generated sequence of a given length.
 
 The purpose of this model was to see whether I could develop an alternative to NVIDIA's ProtGPT2. ProGemma also serves as a stepping stone to a new model that will also utilize control tags to generate proteins based on function.
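For reference, perplexity relates to the reported training loss by a simple exponential. A minimal sketch, assuming the loss is the mean cross-entropy in nats per token (here, one token per amino acid letter):

```python
import math

def perplexity_from_loss(mean_ce_loss: float) -> float:
    # Perplexity is the exponential of the mean per-token cross-entropy loss.
    return math.exp(mean_ce_loss)

# At the reported training loss of ~2.7, the implied per-letter perplexity:
print(round(perplexity_from_loss(2.7), 1))  # ~14.9
```

On a 20-letter amino acid alphabet, a uniform random model would score a perplexity of 20, so a value near 15 indicates the model has learned some sequence structure but is still early in training.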