Update README.md

README.md (changed)
@@ -1,8 +1,7 @@

---
tags:
- protein language model
pipeline_tag: text-classification
---

# PDeepPP model
@@ -42,11 +41,12 @@ To use `PDeepPP`, you need to install the required dependencies, including `torc

```bash
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
pip install transformers
```

Here is an example of how to use PDeepPP to process protein sequences and obtain predictions.

Example for PTM mode:

```python
import torch
from transformers import AutoModel, AutoTokenizer
```
@@ -68,8 +68,9 @@ inputs = processor(sequences=protein_sequences, ptm_mode=True, return_tensors="p

```python
model.eval()
outputs = model(**inputs)
print(outputs["logits"])
```
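The PTM example prints raw `logits`. Turning those unnormalized scores into class probabilities is a standard follow-up step that this README leaves out; below is a minimal pure-Python sketch of that step (the two-class logit values are invented for illustration, not real model output):

```python
import math

def softmax(logits):
    """Convert raw scores to probabilities that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical per-site logits for a binary prediction (modified vs. not)
probs = softmax([2.0, -1.0])
print(probs)                     # probabilities summing to 1
print(probs.index(max(probs)))   # predicted class index
```

In practice the same step is done on the tensor with `torch.softmax(outputs["logits"], dim=-1)`.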
Example for BPS mode:

```python
import torch
from transformers import AutoModel
```
@@ -91,6 +92,7 @@ inputs = processor(sequences=protein_sequences, ptm_mode=False, overlapping=True

```python
model.eval()
outputs = model(**inputs)
print(outputs["logits"])
```
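The BPS-mode processor call passes `overlapping=True`. PDeepPP's actual windowing parameters are not given in this README, so the following is only a generic sketch of overlapping sliding windows over a protein sequence (window size and stride are made-up values, not the processor's real defaults):

```python
def overlapping_windows(seq, size=5, stride=1):
    """Split a sequence into overlapping fixed-size windows.

    NOTE: illustration only. PDeepPP's real windowing (sizes, strides,
    padding) is defined by its processor, not by this sketch.
    """
    return [seq[i:i + size] for i in range(0, len(seq) - size + 1, stride)]

peptide = "MKTAYIAK"
windows = overlapping_windows(peptide, size=5, stride=1)
print(windows)  # ['MKTAY', 'KTAYI', 'TAYIA', 'AYIAK']
```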
## Training and customization

PDeepPP supports fine-tuning on custom datasets. The model uses a configuration class (`PDeepPPConfig`) to specify hyperparameters such as:
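The hyperparameter list itself falls outside the hunks shown in this diff. Purely as a sketch of the configuration-class pattern (the field names here are invented for illustration, not the real `PDeepPPConfig` attributes):

```python
from dataclasses import dataclass, asdict

@dataclass
class ExampleConfig:
    """Hypothetical stand-in for a model config; fields are invented."""
    hidden_size: int = 256
    num_layers: int = 4
    dropout: float = 0.1

# Override one hyperparameter, keep the rest at their defaults
cfg = ExampleConfig(num_layers=6)
print(asdict(cfg))  # {'hidden_size': 256, 'num_layers': 6, 'dropout': 0.1}
```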
@@ -104,9 +106,11 @@ Refer to PDeepPPConfig for details.

## Citation

If you use PDeepPP in your research, please cite the associated paper or repository:

```bibtex
@article{your_reference,
  title={PDeepPP: A Hybrid Model for Protein Sequence Analysis},
  author={Author Name},
  journal={Journal Name},
  year={2025}
}
```