---
license: cc-by-nc-nd-4.0
---

# PepDoRA: A Unified Peptide-Specific Language Model via Weight-Decomposed Low-Rank Adaptation

In this work, we introduce **PepDoRA**, a SMILES-based language model obtained by fine-tuning the state-of-the-art [ChemBERTa-77M-MLM](https://huggingface.co/DeepChem/ChemBERTa-77M-MLM) transformer on modified peptide SMILES via [DoRA](https://nbasyl.github.io/DoRA-project-page/), a novel parameter-efficient fine-tuning (PEFT) method that incorporates weight decomposition. The resulting representations can be leveraged for numerous downstream tasks, including membrane permeability prediction and target binding assessment, for both unmodified and modified peptide sequences.
Here's how to extract PepDoRA embeddings for your input peptide:
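A minimal sketch using the standard Hugging Face `transformers` API. For illustration it loads the [ChemBERTa-77M-MLM](https://huggingface.co/DeepChem/ChemBERTa-77M-MLM) base checkpoint named above; substitute the PepDoRA checkpoint ID once available. The example peptide SMILES and the mean-pooling step are illustrative choices, not a prescribed protocol.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Placeholder: swap in the PepDoRA checkpoint ID; the base model is used here
# so the snippet runs as-is.
model_name = "DeepChem/ChemBERTa-77M-MLM"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
model.eval()

# An input peptide represented as a SMILES string (illustrative example)
peptide_smiles = "CC(C)C[C@@H](NC(=O)[C@@H](N)Cc1ccccc1)C(=O)O"

inputs = tokenizer(peptide_smiles, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the final hidden states over tokens to obtain a single
# fixed-size embedding vector for the peptide.
embedding = outputs.last_hidden_state.mean(dim=1).squeeze(0)
print(embedding.shape)
```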