fondress committed on
Commit 6829651 · verified · 1 Parent(s): 77f526b

Update README.md

Files changed (1):
  1. README.md +19 -17

README.md CHANGED
 
@@ -43,17 +43,17 @@ pip install torch torchvision torchaudio --index-url https://download.pytorch.or
 pip install transformers
 ```
 
- Here is an example of how to use PDeepPP to process protein sequences and obtain predictions:
+ Here is an example of how to use `PDeepPP` to process protein sequences and obtain predictions:
 
- Example for PTM mode:
+ ### Example for PTM mode:
 ```python
 import torch
 from transformers import AutoModel, AutoTokenizer
 
 # Load PDeepPP model
 device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
 print(f"Using {device} device")
 model = AutoModel.from_pretrained("YourModelName/PDeepPP", trust_remote_code=True)
 model.to(device)
 
 # Example protein sequences
@@ -69,15 +69,15 @@ model.eval()
 outputs = model(**inputs)
 print(outputs["logits"])
 ```
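The PTM example above stops at printing raw logits. As a minimal sketch of a possible next step (assuming one logit per residue and a 0.5 probability cutoff, neither of which is stated in this diff), the logits could be thresholded into per-residue site calls:

```python
import math

def logits_to_ptm_sites(logits, threshold=0.5):
    """Map per-residue logits to binary PTM-site calls.

    Assumes one logit per residue (a simplification; the real
    model output shape may differ). A residue is called a site
    when sigmoid(logit) exceeds `threshold`.
    """
    probs = [1.0 / (1.0 + math.exp(-z)) for z in logits]
    return [int(p > threshold) for p in probs]

# Hypothetical logits for a 6-residue sequence
print(logits_to_ptm_sites([-2.0, 0.1, 3.5, -0.4, 1.2, -5.0]))  # → [0, 1, 1, 0, 1, 0]
```

The helper is deliberately framework-free; with a real tensor of the assumed shape, something like `outputs["logits"].squeeze().tolist()` could produce the flat list it expects.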
- Example for BPS mode:
+ ### Example for BPS mode:
 ```python
 import torch
 from transformers import AutoModel
 
 # Load PDeepPP model
 device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
 print(f"Using {device} device")
 model = AutoModel.from_pretrained("YourModelName/PDeepPP", trust_remote_code=True)
 model.to(device)
 
 # Example protein sequences
@@ -94,21 +94,23 @@ outputs = model(**inputs)
 print(outputs["logits"])
 ```
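The BPS example likewise ends at raw logits. A hedged sketch of picking a predicted class per sequence via softmax (the two-class, one-logit-vector-per-sequence layout is an assumption, not something this diff states):

```python
import math

def classify_bps(logits_per_sequence):
    """Pick the argmax class for each sequence from raw logits.

    Assumes the model emits one logit vector per input sequence
    (hypothetical layout; check the model card for the real shape).
    """
    results = []
    for logits in logits_per_sequence:
        # Numerically stable softmax, then argmax
        exps = [math.exp(z - max(logits)) for z in logits]
        total = sum(exps)
        probs = [e / total for e in exps]
        results.append(probs.index(max(probs)))
    return results

# Two hypothetical sequences, two classes (non-BPS vs. BPS)
print(classify_bps([[0.2, 1.9], [2.3, -0.7]]))  # → [1, 0]
```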
 
- Training and customization
- PDeepPP supports fine-tuning on custom datasets. The model uses a configuration class (PDeepPPConfig) to specify hyperparameters such as:
+ ## Training and customization
+
+ `PDeepPP` supports fine-tuning on custom datasets. The model uses a configuration class (`PDeepPPConfig`) to specify hyperparameters such as:
 
- Number of transformer layers
- Hidden layer size
- Dropout rate
- PTM type and other task-specific parameters
- Refer to PDeepPPConfig for details.
+ - **Number of transformer layers**
+ - **Hidden layer size**
+ - **Dropout rate**
+ - **PTM type** and other task-specific parameters
+
+ Refer to `PDeepPPConfig` for details.
 
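As a plain-Python illustration of the hyperparameters listed above, a stand-in configuration object might look like the following (the class and field names here are hypothetical, not the real `PDeepPPConfig` attributes; refer to the actual class for the true names and defaults):

```python
class PDeepPPConfigSketch:
    """Hypothetical stand-in for PDeepPPConfig.

    Field names and defaults are illustrative only, not taken
    from the actual class.
    """

    def __init__(self, num_layers=6, hidden_size=256,
                 dropout=0.1, ptm_type="phosphorylation"):
        self.num_layers = num_layers    # number of transformer layers
        self.hidden_size = hidden_size  # hidden layer size
        self.dropout = dropout          # dropout rate
        self.ptm_type = ptm_type        # task-specific PTM type

# Override one hyperparameter, keep the rest at their defaults
config = PDeepPPConfigSketch(dropout=0.2)
print(config.num_layers, config.dropout)  # → 6 0.2
```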
- Citation
- If you use PDeepPP in your research, please cite the associated paper or repository:
+ ## Citation
+ If you use `PDeepPP` in your research, please cite the associated paper or repository:
 
 ```
 @article{your_reference,
 title={PDeepPP: A Hybrid Model for Protein Sequence Analysis},
 author={Author Name},
 journal={Journal Name},
 year={2025}