JuIm committed on
Commit
4522306
·
verified ·
1 Parent(s): f7e6d72

End of training

README.md CHANGED
@@ -12,7 +12,37 @@ should probably proofread and complete it, then remove this comment. -->
 
 # ProGemma
 
-This is a custom configuration (275M parameters) of a Gemma 2 LLM that is being pre-trained on a corpus of amino acid sequences, with the goal of generating de novo amino acid sequences in a zero-shot fashion. As of 8.14.2024, the model has been trained on 55% of the dataset; a new version will be pushed as training reaches each checkpoint. Preliminary evaluation shows perplexity, HHblits E-value, pLDDT, pTM, and ipTM scores on par with ProtGPT2; a full evaluation will follow once training completes. The model can be accessed with the transformers.pipeline function, using "JuIm/Amino-Acid-Sequence-Tokenizer" as the tokenizer.
+This model is a fine-tuned version of [JuIm/ProGemma](https://huggingface.co/JuIm/ProGemma) on an unknown dataset.
+
+## Model description
+
+More information needed
+
+## Intended uses & limitations
+
+More information needed
+
+## Training and evaluation data
+
+More information needed
+
+## Training procedure
+
+### Training hyperparameters
+
+The following hyperparameters were used during training:
+- learning_rate: 0.001
+- train_batch_size: 1
+- eval_batch_size: 8
+- seed: 42
+- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+- lr_scheduler_type: linear
+- lr_scheduler_warmup_ratio: 0.4
+- training_steps: 7000
+
+### Training results
+
 
 ### Framework versions
 
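The README's usage note (load the model through `transformers.pipeline` with the `JuIm/Amino-Acid-Sequence-Tokenizer` tokenizer) can be sketched as below. Only the model and tokenizer IDs come from the README; the prompt, sampling settings, and the canonical-alphabet filter are illustrative assumptions, and building the pipeline downloads the full checkpoint (~1.1 GB) on first use.

```python
# Sketch of the usage described in the README. Model and tokenizer IDs
# are from the README; everything else here is an illustrative assumption.

# The 20 canonical amino-acid letters, used by a hypothetical
# post-generation sanity filter (not part of the source).
CANONICAL_AA = set("ACDEFGHIKLMNPQRSTVWY")


def is_canonical(seq: str) -> bool:
    """True if seq is non-empty and uses only canonical residues."""
    return bool(seq) and set(seq) <= CANONICAL_AA


def generate_sequences(prompt: str = "M", n: int = 3) -> list[str]:
    """Sample n de novo sequences from the ProGemma checkpoint.

    transformers is imported lazily because constructing the pipeline
    triggers the weight download.
    """
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="JuIm/ProGemma",
        tokenizer="JuIm/Amino-Acid-Sequence-Tokenizer",
    )
    outputs = generator(
        prompt,
        max_new_tokens=60,
        do_sample=True,
        num_return_sequences=n,
    )
    return [o["generated_text"] for o in outputs]
```

A downstream filter such as `is_canonical` is one plausible way to discard generations containing non-residue tokens before structural evaluation.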
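The hyperparameter list in the updated card maps onto a 🤗 Transformers `TrainingArguments` object roughly as follows. This is a reconstruction from the listed values, not the repository's actual training script, and the output directory is a placeholder.

```python
# Reconstruction of the card's hyperparameter list as TrainingArguments.
# Values mirror the list; output_dir is a placeholder assumption.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="progemma-finetune",  # placeholder path
    learning_rate=1e-3,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                  # optimizer: Adam, betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.4,                # lr_scheduler_warmup_ratio
    max_steps=7000,                  # training_steps
)
```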
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:22c63b4ca962e7b70326be91958475f858a8b14f5124c6675d85c69c1a35c272
+oid sha256:bd0f9cc3e7cf4392ebbce95a26fef4d5d268fff42d3381ce8c0d9ce9d8ba8616
 size 1101271208
runs/Aug15_18-47-15_084ff3810925/events.out.tfevents.1723747641.084ff3810925.1011.0 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ae8f2f0dd836ce535827233cb8031ad2a8794ab5c9bc93474f9252d421aa9f6d
+size 1481827
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:57ac5a24a724dafa875652a541a59872d27c542a121bfe00222bf246d11afb8b
+oid sha256:6ec12736f6421b826abf232f8c0dc0d65d69780d43b0d3ce994675dcc36e5165
 size 5112