SkywalkerLu committed on
Commit b429e09 · verified · 1 Parent(s): f0e6d01

Update README.md

Files changed (1): README.md (+8 −7)
README.md CHANGED
@@ -1,4 +1,4 @@
-# TransHLA2.0
+# TransHLA2.0-BIND
 
 A minimal Hugging Face-compatible PyTorch model for peptide–HLA binding classification using ESM with optional LoRA and cross-attention. There is no custom predict API; inference follows the training path: tokenize peptide and HLA pseudosequence with the ESM tokenizer, pad or truncate to fixed lengths (default peptide=16, HLA=36), run a forward pass as `logits, features = model(epitope_ids, hla_ids)`, then apply softmax to get the binding probability.
 
@@ -13,8 +13,9 @@ Requirements:
 Install:
 ```bash
 pip install torch transformers peft
-
-
+```
+How to use TransHLA2.0-BIND
+```python
 import torch
 import torch.nn.functional as F
 from transformers import AutoModel, AutoTokenizer
@@ -33,9 +34,7 @@ tok = AutoTokenizer.from_pretrained("facebook/esm2_t33_650M_UR50D")
 peptide = "GILGFVFTL" # 9-mer example
 # Fake placeholder pseudosequence for demo; replace with a real one from your mapping/data
 hla_pseudoseq = (
-    "GSHSMRYFYTAVSRPGRGEPRFIAVGYVDDTQFVRFDSDAASPRMEPRAPWIEQEGPEYWERETRNVK"
-    "AQSQTDRVDLRTLLRYNQSEAGSHTVQRMYGCDVGSDWRFLRGYHQYAYDGKDYIALNEDLRSWTAAD"
-    "MAAQTTKHKWEQAGAAER"
+    "YYSEYRNIYAQTDESNLYLSYDYYTWAERAYEWY"
 )
 
 # Fixed lengths (must match training)
@@ -64,4 +63,6 @@ with torch.no_grad():
 prob_bind = F.softmax(logits, dim=1)[0, 1].item()
 pred = int(prob_bind >= 0.5)
 
-print({"peptide": peptide, "bind_prob": round(prob_bind, 6), "label": pred})
+print({"peptide": peptide, "bind_prob": round(prob_bind, 6), "label": pred})
+```
+
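
For context, here is a self-contained sketch of the inference path the README documents: tokenize both sequences with the ESM tokenizer, pad or truncate to the fixed training lengths (peptide=16, HLA=36), run the forward pass, and softmax the logits. The checkpoint id `SkywalkerLu/TransHLA2.0-BIND` and the `trust_remote_code=True` flag are assumptions for illustration, not taken from this commit; the tokenizer name, fixed lengths, forward signature, and example sequences all come from the README itself.

```python
# Minimal sketch of the README's inference path; the model repo id below is a
# placeholder assumption — substitute the actual checkpoint you are using.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "SkywalkerLu/TransHLA2.0-BIND"  # hypothetical repo id
PEPTIDE_LEN, HLA_LEN = 16, 36              # fixed lengths from training

tok = AutoTokenizer.from_pretrained("facebook/esm2_t33_650M_UR50D")
model = AutoModel.from_pretrained(MODEL_ID, trust_remote_code=True)  # assumed loading path
model.eval()

def encode(seq: str, max_len: int) -> torch.Tensor:
    """Tokenize one sequence, then pad or truncate to a fixed length."""
    ids = tok(seq, add_special_tokens=True)["input_ids"]
    ids = ids[:max_len] + [tok.pad_token_id] * max(0, max_len - len(ids))
    return torch.tensor([ids])  # shape (1, max_len)

epitope_ids = encode("GILGFVFTL", PEPTIDE_LEN)
hla_ids = encode("YYSEYRNIYAQTDESNLYLSYDYYTWAERAYEWY", HLA_LEN)

# Forward pass and softmax, exactly as the README describes
with torch.no_grad():
    logits, features = model(epitope_ids, hla_ids)

prob_bind = F.softmax(logits, dim=1)[0, 1].item()
print({"bind_prob": round(prob_bind, 6), "label": int(prob_bind >= 0.5)})
```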