---
base_model: seyonec/ChemBERTa-zinc-base-v1
library_name: transformers
license: mit
tags:
- PROTAC
- cheminformatics
- generated_from_trainer
model-index:
- name: ailab-bio/PROTAC-Splitter-EncoderDecoder-lr_reduce-opt25
results: []
---
# ailab-bio/PROTAC-Splitter-EncoderDecoder-lr_reduce-opt25
This model is a fine-tuned version of [seyonec/ChemBERTa-zinc-base-v1](https://huggingface.co/seyonec/ChemBERTa-zinc-base-v1) on the ailab-bio/PROTAC-Splitter-Dataset dataset.
It achieves the following results on the evaluation set (a sketch of how metrics of this kind can be computed follows the list):
- Loss: 0.3420
- Num Fragments: 3.0002
- Linker Heavy Atoms Difference: 0.1689
- Linker Graph Edit Distance: 3.7181e+61
- Tanimoto Similarity: 0.0
- Linker Tanimoto Similarity: 0.0
- E3 Valid: 0.9732
- Linker Has Attachment Point(s): 0.9963
- Poi Equal: 0.7897
- Heavy Atoms Difference: 8.0244
- Poi Has Attachment Point(s): 0.9305
- E3 Equal: 0.8302
- Linker Graph Edit Distance Norm: inf
- E3 Has Attachment Point(s): 0.9732
- Has Three Substructures: 0.9995
- Poi Heavy Atoms Difference Norm: 0.0690
- Linker Equal: 0.8419
- Heavy Atoms Difference Norm: 0.1076
- E3 Heavy Atoms Difference: 1.0454
- Poi Valid: 0.9305
- Valid: 0.9027
- Linker Heavy Atoms Difference Norm: -0.0046
- Has All Attachment Points: 0.9796
- E3 Graph Edit Distance: inf
- Linker Valid: 0.9963
- Poi Tanimoto Similarity: 0.0
- Poi Graph Edit Distance Norm: inf
- Poi Heavy Atoms Difference: 2.0482
- Poi Graph Edit Distance: inf
- Reassembly Nostereo: 0.6261
- E3 Graph Edit Distance Norm: inf
- E3 Heavy Atoms Difference Norm: 0.0335
- Reassembly: 0.6073
- All Ligands Equal: 0.5992
- E3 Tanimoto Similarity: 0.0
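For orientation, the sketch below shows how fragment-level metrics of this kind are commonly computed with RDKit. The metric definitions used here (canonical-SMILES equality, Morgan-fingerprint Tanimoto similarity, heavy-atom count difference) are assumptions for illustration, not the evaluation code actually used for this model:

```python
# Minimal sketch of fragment-level metrics, assuming canonical-SMILES equality
# for the "Equal" metrics, Morgan fingerprints for "Tanimoto Similarity", and
# heavy-atom counts for "Heavy Atoms Difference". These definitions are
# assumptions; the actual evaluation code is not published in this card.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

def canonical(smiles: str):
    """Canonical SMILES, or None if parsing fails (basis of the 'Valid' metrics)."""
    mol = Chem.MolFromSmiles(smiles)
    return Chem.MolToSmiles(mol) if mol is not None else None

def exact_match(pred: str, true: str) -> bool:
    """'Equal'-style metric: canonical-SMILES identity of a predicted fragment."""
    cp, ct = canonical(pred), canonical(true)
    return cp is not None and cp == ct

def tanimoto(pred: str, true: str) -> float:
    """'Tanimoto Similarity'-style metric on radius-2 Morgan fingerprints."""
    mols = [Chem.MolFromSmiles(s) for s in (pred, true)]
    if any(m is None for m in mols):
        return 0.0
    fps = [AllChem.GetMorganFingerprintAsBitVect(m, 2, nBits=2048) for m in mols]
    return DataStructs.TanimotoSimilarity(fps[0], fps[1])

def heavy_atom_difference(pred: str, true: str) -> int:
    """'Heavy Atoms Difference'-style metric: signed heavy-atom count difference."""
    mp, mt = Chem.MolFromSmiles(pred), Chem.MolFromSmiles(true)
    return mp.GetNumHeavyAtoms() - mt.GetNumHeavyAtoms()
```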
## Model description
This appears to be a sequence-to-sequence encoder-decoder model, initialized from [seyonec/ChemBERTa-zinc-base-v1](https://huggingface.co/seyonec/ChemBERTa-zinc-base-v1), that splits the SMILES string of a PROTAC into its three constituent substructures: the protein-of-interest (POI) ligand, the linker, and the E3 ligase ligand. This description is inferred from the model name and the evaluation metrics above.
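A minimal inference sketch is shown below. It assumes the checkpoint loads as a standard `transformers` `EncoderDecoderModel` and that both input and output are SMILES strings; the exact output format of the three fragments is not documented here, so treat this as a starting point rather than the confirmed interface.

```python
from transformers import AutoTokenizer, EncoderDecoderModel

model_id = "ailab-bio/PROTAC-Splitter-EncoderDecoder-lr_reduce-opt25"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = EncoderDecoderModel.from_pretrained(model_id)  # assumed architecture class

# Placeholder input; replace with a real PROTAC SMILES string.
protac_smiles = "CCO"
inputs = tokenizer(protac_smiles, return_tensors="pt")

# Greedy decoding; the model is expected to emit the POI ligand, linker, and
# E3 ligand substructures (the output formatting is an assumption of this sketch).
output_ids = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```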
## Intended uses & limitations
More information needed
## Training and evaluation data
The model was fine-tuned and evaluated on the ailab-bio/PROTAC-Splitter-Dataset dataset (see above). Details of the dataset splits and preprocessing are not provided in this card.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` reconstruction is sketched after this list):
- learning_rate: 5e-05
- train_batch_size: 128
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: reduce_lr_on_plateau
- training_steps: 100000
- mixed_precision_training: Native AMP
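As a rough reconstruction, these settings map onto `transformers` `TrainingArguments` as sketched below; the output directory is an assumption, and model construction, dataset loading, and the data collator are omitted:

```python
from transformers import TrainingArguments

# Reconstruction of the reported hyperparameters; the optimizer settings
# (Adam betas=(0.9, 0.999), epsilon=1e-8) match the Trainer defaults.
training_args = TrainingArguments(
    output_dir="protac-splitter-encoderdecoder",  # assumed, not stated in this card
    learning_rate=5e-5,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="reduce_lr_on_plateau",
    max_steps=100_000,
    fp16=True,  # "Native AMP" mixed-precision training
)
```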
### Training results
| Metric | Step 80000 (epoch 7.8911) | Step 100000 (epoch 9.8639) |
|:------|:------:|:------:|
| Training Loss | 0.0005 | 0.0005 |
| Validation Loss | 0.3375 | 0.3420 |
| Num Fragments | 2.9994 | 3.0002 |
| Linker Heavy Atoms Difference | 0.2223 | 0.1689 |
| Linker Graph Edit Distance | inf | 3.7181e+61 |
| Tanimoto Similarity | 0.0 | 0.0 |
| Linker Tanimoto Similarity | 0.0 | 0.0 |
| E3 Valid | 0.9717 | 0.9732 |
| Linker Has Attachment Point(s) | 0.9961 | 0.9963 |
| Poi Equal | 0.7867 | 0.7897 |
| Heavy Atoms Difference | 9.0605 | 8.0244 |
| Poi Has Attachment Point(s) | 0.9179 | 0.9305 |
| E3 Equal | 0.8268 | 0.8302 |
| Linker Graph Edit Distance Norm | inf | inf |
| E3 Has Attachment Point(s) | 0.9717 | 0.9732 |
| Has Three Substructures | 0.9994 | 0.9995 |
| Poi Heavy Atoms Difference Norm | 0.0813 | 0.0690 |
| Linker Equal | 0.8393 | 0.8419 |
| Heavy Atoms Difference Norm | 0.1211 | 0.1076 |
| E3 Heavy Atoms Difference | 0.9062 | 1.0454 |
| Poi Valid | 0.9179 | 0.9305 |
| Valid | 0.8893 | 0.9027 |
| Linker Heavy Atoms Difference Norm | 0.0019 | -0.0046 |
| Has All Attachment Points | 0.9788 | 0.9796 |
| E3 Graph Edit Distance | inf | inf |
| Linker Valid | 0.9961 | 0.9963 |
| Poi Tanimoto Similarity | 0.0 | 0.0 |
| Poi Graph Edit Distance Norm | inf | inf |
| Poi Heavy Atoms Difference | 2.4517 | 2.0482 |
| Poi Graph Edit Distance | 8.2064e+62 | inf |
| Reassembly Nostereo | 0.6211 | 0.6261 |
| E3 Graph Edit Distance Norm | inf | inf |
| E3 Heavy Atoms Difference Norm | 0.0334 | 0.0335 |
| Reassembly | 0.6011 | 0.6073 |
| All Ligands Equal | 0.5936 | 0.5992 |
| E3 Tanimoto Similarity | 0.0 | 0.0 |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.0.0
- Tokenizers 0.19.1