Emaad committed
Commit 39b1c94 · 1 Parent(s): 7459866

Update README.md

Files changed (1): README.md (+29 −14)
README.md CHANGED
@@ -18,7 +18,7 @@ CELL-E 2 is the second iteration of the original [CELL-E](https://www.biorxiv.or
 
 CELL-E 2 is a novel bidirectional transformer that can generate images depicting protein subcellular localization from amino acid sequences (and *vice versa*).
 CELL-E 2 not only captures the spatial complexity of protein localization and produces probability estimates of localization atop a nucleus image, but can also generate sequences from images, enabling *de novo* protein design.
-We trained on the [Human Protein Atlas](https://www.proteinatlas.org) and the [OpenCell](https://opencell.czbiohub.org) datasets.
 
 CELL-E 2 utilizes pretrained amino acid embeddings from [ESM-2](https://github.com/facebookresearch/esm).
@@ -26,25 +26,40 @@ CELL-E 2 utilizes pretrained amino acid embeddings from [ESM-2](https://github.c
 ## Model variations
 
 We have made several versions of CELL-E 2 available. The naming scheme follows the structure ```training set_hidden size``` where the hidden size is set to the embedding dimension of the pretrained ESM-2 model.
 
-### HPA Models:
-
-| Model | #params | Language |
 |------------------------|--------------------------------|-------|
-| [`bert-base-uncased`](https://huggingface.co/bert-base-uncased) | 110M | English |
-| [`bert-large-uncased`](https://huggingface.co/bert-large-uncased) | 340M | English |
-| [`bert-base-cased`](https://huggingface.co/bert-base-cased) | 110M | English |
-| [`bert-large-cased`](https://huggingface.co/bert-large-cased) | 340M | English |
-
-## Intended uses & limitations
-
-You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to
-be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=bert) to look for
-fine-tuned versions of a task that interests you.
-
-Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
-to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
-generation you should look at model like GPT2.
 
 ### How to use
 
 
 CELL-E 2 is a novel bidirectional transformer that can generate images depicting protein subcellular localization from amino acid sequences (and *vice versa*).
 CELL-E 2 not only captures the spatial complexity of protein localization and produces probability estimates of localization atop a nucleus image, but can also generate sequences from images, enabling *de novo* protein design.
+We trained on the [Human Protein Atlas](https://www.proteinatlas.org) (HPA) and the [OpenCell](https://opencell.czbiohub.org) datasets.
 
 CELL-E 2 utilizes pretrained amino acid embeddings from [ESM-2](https://github.com/facebookresearch/esm).
 
 
 ## Model variations
 
 We have made several versions of CELL-E 2 available. The naming scheme follows the structure ```training set_hidden size``` where the hidden size is set to the embedding dimension of the pretrained ESM-2 model.
+We annotate the most useful models in the Notes column; the other variants can still be used when memory is constrained.
+Since these models share similarities with BERT, the embeddings from any of these models may be beneficial for downstream tasks.
 
+**HPA Models**:
+HPA models are trained on the HPA dataset. They are best for general-purpose predictions, as the dataset includes a variety of cell types.
 
+| Model | Size | Notes |
 |------------------------|--------------------------------|-------|
+| [`HPA_480`](https://huggingface.co/HuangLab/CELL-E_2_HPA_480) | 4.73 GB | **Best for Image Prediction** |
+| [`HPA_640`](https://huggingface.co/HuangLab/CELL-E_2_HPA_640) | 6.31 GB | |
+| [`HPA_1280`](https://huggingface.co/HuangLab/CELL-E_2_HPA_1280) | 10.8 GB | |
+| [`HPA_2560`](https://huggingface.co/HuangLab/CELL-E_2_HPA_2560) | 17.5 GB | **Best for Sequence Prediction** |
 
+**OpenCell Models**:
+OpenCell models are trained on the OpenCell dataset, which contains only HEK cells, so they should ideally be used only for predictions on HEK cells. They perform well on image prediction, but the generated heatmaps contain little information.
+
+| Model | Size | Notes |
+|------------------------|--------------------------------|-------|
+| [`OpenCell_480`](https://huggingface.co/HuangLab/CELL-E_2_OpenCell_480) | 4.73 GB | |
+| [`OpenCell_640`](https://huggingface.co/HuangLab/CELL-E_2_OpenCell_640) | 6.31 GB | |
+| [`OpenCell_1280`](https://huggingface.co/HuangLab/CELL-E_2_OpenCell_1280) | 10.8 GB | |
+| [`OpenCell_2560`](https://huggingface.co/HuangLab/CELL-E_2_OpenCell_2560) | 17.5 GB | **Best for Sequence Prediction** |
+
+**Finetuned HPA Models**:
+These models used the HPA models as checkpoints and were then finetuned on the OpenCell dataset. We found that they improve image generation capabilities, but we did not necessarily see an improvement in sequence prediction.
+
+| Model | Size | Notes |
+|------------------------|--------------------------------|-------|
+| [`HPA_Finetuned_480`](https://huggingface.co/HuangLab/CELL-E_2_HPA_Finetuned_480) | 4.73 GB | **Best for Image Prediction** |
+| [`HPA_Finetuned_640`](https://huggingface.co/HuangLab/CELL-E_2_HPA_Finetuned_640) | 6.31 GB | |
+| [`HPA_Finetuned_1280`](https://huggingface.co/HuangLab/CELL-E_2_HPA_Finetuned_1280) | 10.8 GB | |
+| [`HPA_Finetuned_2560`](https://huggingface.co/HuangLab/CELL-E_2_HPA_Finetuned_2560) | 17.5 GB | |
 
  ### How to use
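Since every checkpoint above follows the ```training set_hidden size``` naming scheme under the `HuangLab` organization, the repo ID for a given variant can be assembled programmatically. A minimal sketch; the helper function and validation sets below are our own illustration, not part of the CELL-E 2 codebase:

```python
# Hypothetical helper (not part of the CELL-E 2 release) that assembles a
# Hugging Face repo ID from the naming scheme "training set_hidden size".
TRAINING_SETS = {"HPA", "OpenCell", "HPA_Finetuned"}
HIDDEN_SIZES = {480, 640, 1280, 2560}  # embedding dims of the pretrained ESM-2 models

def cell_e2_repo_id(training_set: str, hidden_size: int) -> str:
    """Return a repo ID such as 'HuangLab/CELL-E_2_HPA_480'."""
    if training_set not in TRAINING_SETS:
        raise ValueError(f"unknown training set: {training_set!r}")
    if hidden_size not in HIDDEN_SIZES:
        raise ValueError(f"unknown hidden size: {hidden_size}")
    return f"HuangLab/CELL-E_2_{training_set}_{hidden_size}"

print(cell_e2_repo_id("HPA", 480))  # HuangLab/CELL-E_2_HPA_480
```

The resulting ID can then be passed to, e.g., `huggingface_hub.snapshot_download` to fetch a checkpoint locally.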