smares committed on
Commit a7d2045 · verified · 1 Parent(s): ca6c5a9

Update README.md

Files changed (1)
  1. README.md +17 -23
README.md CHANGED
@@ -34,36 +34,24 @@ This guide shows a new user how to:
  Install the basics:
 
  ```bash
- pip install -U torch huggingface_hub
- # optional speed up for large files
- pip install -U hf_transfer
- ```
-
- If you prefer conda:
-
- ```bash
- # example only - adjust paths to your cluster
 
- conda create -y -n ESMCBA python=3.10
- conda activate ESMCBA
 
- # Core dependencies
- pip install torch==2.6.0 transformers==4.46.3 esm==3.1.3 \
- biopython==1.85 umap-learn==0.5.7 scikit-learn==1.6.1 \
- seaborn==0.13.2 pandas==2.2.3 matplotlib==3.10.1
 
- # For downloading model checkpoints from Hugging Face
- pip install -U huggingface_hub
-
- # Optional: speed up large file downloads
- pip install -U hf_transfer
  ```
 
  ## 2. Get the code
 
  ```bash
  git clone https://github.com/sermare/ESMCBA
- cd ESMCBA
  ```
 
  Inside the repo you should have `embeddings.py` available. If your file is named `embeddings_generation.py`, use that name instead in the commands below.
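The conda block removed above pinned exact versions. If you want to reproduce that environment with plain pip, the same pins (copied from the removed lines; nothing here beyond what the deleted block listed) can live in a `requirements.txt`:

```text
torch==2.6.0
transformers==4.46.3
esm==3.1.3
biopython==1.85
umap-learn==0.5.7
scikit-learn==1.6.1
seaborn==0.13.2
pandas==2.2.3
matplotlib==3.10.1
```

Then install with `pip install -r requirements.txt` inside a fresh environment.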
@@ -88,8 +76,12 @@ You can browse all files here: https://huggingface.co/smares/ESMCBA
  # download everything to ./models
  hf download smares/ESMCBA --repo-type model --local-dir ./models
 
- # or download a single file to ./models
- hf download smares/ESMCBA "ESMCBA_epitope_0.5_20_ESMMASK_epitope_FT_15_0.0001_1e-05_AUG_6_HLAB5101_5_0.001_1e-06__3_B5101_Hubber_B5101_final.pth" --repo-type model --local-dir ./models
  ```
 
  ### Option B — rely on the Hugging Face cache
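The long checkpoint filenames above bundle training metadata into the name itself (e.g. `HLAB5101` for the allele). A hypothetical parser — the field meanings are inferred from the examples, not documented by the repo — can pull out the recognizable bits:

```python
import re

def parse_checkpoint_name(filename: str) -> dict:
    """Extract inferred fields from an ESMCBA checkpoint filename."""
    # The allele appears inside the name as e.g. "HLAB5101"
    allele = re.search(r"HLA([A-C]\d{4})", filename)
    return {
        "hla": allele.group(1) if allele else None,
        "encoding": "epitope" if "_epitope_" in filename else None,
        "final": filename.endswith("_final.pth"),
    }

name = ("ESMCBA_epitope_0.5_20_ESMMASK_epitope_FT_15_0.0001_1e-05"
        "_AUG_6_HLAB5101_5_0.001_1e-06__3_B5101_Hubber_B5101_final.pth")
print(parse_checkpoint_name(name))
```

This is only a convenience for picking the right file out of `./models`; the training scripts do not depend on it.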
@@ -121,6 +113,8 @@ python3 embeddings_generation.py --model_path ./models/ESMCBA_epitope_0.5_20_E
  If `embeddings_generation.py` supports resolving from the Hub, you can pass either a file name or an `hf://` path and let the script download to cache.
 
  ```bash
  python3 embeddings_generation.py --model_path "ESMCBA_epitope_0.95_30_ESMMASK_epitope_FT_25_0.001_5e-05_AUG_3_HLAB1402_2_1e-05_1e-06__1_B1402_0404_Hubber_B1402_final.pth" --name B1402-ESMCBA --hla B1402 --encoding epitope --output_dir ./outputs --peptides ASCQQQRAGHS ASCQQQRAGH ASCQQQRAG DVRLSAHHHR DVRLSAHHHRM GHSDVRLSAHH
  ```
@@ -34,36 +34,24 @@ This guide shows a new user how to:
  Install the basics:
 
  ```bash
+ # Install core PyTorch and Transformers ecosystem
+ pip install torch
+ pip install transformers
+ pip install esm
 
+ # Install Hugging Face Hub utilities
+ pip install "huggingface-hub<1.0"
 
+ # Optional: Install hf_transfer for faster large file downloads
+ pip install hf_transfer
 
+ pip install biopython umap-learn scikit-learn seaborn pandas matplotlib
  ```
 
  ## 2. Get the code
 
  ```bash
  git clone https://github.com/sermare/ESMCBA
  ```
 
  Inside the repo you should have `embeddings.py` available. If your file is named `embeddings_generation.py`, use that name instead in the commands below.
 
@@ -88,8 +76,12 @@ You can browse all files here: https://huggingface.co/smares/ESMCBA
  # download everything to ./models
  hf download smares/ESMCBA --repo-type model --local-dir ./models
 
+ # or just get one model
+ huggingface-cli download smares/ESMCBA \
+ "ESMCBA_epitope_0.8_30_ESMMASK_epitope_FT_5_0.001_1e-06_AUG_6_HLAA0201_2_0.001_1e-06__2_A0201_Hubber_A0201_final.pth" \
+ --repo-type model \
+ --local-dir ./models
+
  ```
 
  ### Option B — rely on the Hugging Face cache
 
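When no `--local-dir` is given, downloads land in the shared Hugging Face cache. A sketch of how the default location is resolved, following huggingface_hub's documented `HF_HUB_CACHE`/`HF_HOME` precedence (illustrative only; the library computes this itself):

```python
import os

def hf_cache_dir() -> str:
    """Resolve the Hugging Face hub cache directory:
    HF_HUB_CACHE wins, then $HF_HOME/hub, then ~/.cache/huggingface/hub."""
    if "HF_HUB_CACHE" in os.environ:
        return os.environ["HF_HUB_CACHE"]
    hf_home = os.environ.get(
        "HF_HOME",
        os.path.join(os.path.expanduser("~"), ".cache", "huggingface"),
    )
    return os.path.join(hf_home, "hub")

print(hf_cache_dir())
```

Checkpoints cached this way are stored under a `models--smares--ESMCBA` snapshot directory inside that folder.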
@@ -121,6 +113,8 @@ python3 embeddings_generation.py --model_path ./models/ESMCBA_epitope_0.5_20_E
  If `embeddings_generation.py` supports resolving from the Hub, you can pass either a file name or an `hf://` path and let the script download to cache.
 
  ```bash
+ cd ESMCBA/ESMCBA
+
  python3 embeddings_generation.py --model_path "ESMCBA_epitope_0.95_30_ESMMASK_epitope_FT_25_0.001_5e-05_AUG_3_HLAB1402_2_1e-05_1e-06__1_B1402_0404_Hubber_B1402_final.pth" --name B1402-ESMCBA --hla B1402 --encoding epitope --output_dir ./outputs --peptides ASCQQQRAGHS ASCQQQRAGH ASCQQQRAG DVRLSAHHHR DVRLSAHHHRM GHSDVRLSAHH
  ```
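The `--peptides` values in the command above are overlapping fragments of the same region. A small helper — an illustration, not part of the ESMCBA repo — can enumerate such candidate windows from a parent sequence:

```python
def peptide_windows(seq: str, min_len: int = 9, max_len: int = 11) -> list[str]:
    """Enumerate all contiguous subpeptides of `seq` with lengths in
    [min_len, max_len], e.g. candidate epitopes to pass via --peptides."""
    return [seq[i:i + k]
            for k in range(min_len, max_len + 1)
            for i in range(len(seq) - k + 1)]

# "ASCQQQRAGHS" is one of the example peptides from the command above
windows = peptide_windows("ASCQQQRAGHS")
print(windows)
```

The output can be joined with spaces and pasted directly after `--peptides`.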