---
license: other
widget:
- text: Ḣ Q V Q [MASK] E
tags:
- biology
- medical
---

## AntiBERTa2 🧬

AntiBERTa2 is an antibody-specific language model based on the [RoFormer model](https://arxiv.org/abs/2104.09864); it is pre-trained using masked language modelling.
We also provide a multimodal version, AntiBERTa2-CSSP, trained using a contrastive objective similar to the [CLIP method](https://arxiv.org/abs/2103.00020).
Further details on both AntiBERTa2 and AntiBERTa2-CSSP are given in our [paper](https://www.mlsb.io/papers_2023/Enhancing_Antibody_Language_Models_with_Structural_Information.pdf), accepted at the NeurIPS MLSB Workshop 2023.

Both AntiBERTa2 models are available for non-commercial use only. Output antibody sequences (e.g. from infilling via masked language modelling) may likewise only be used non-commercially. If you are seeking commercial use of our models and generated antibodies, please reach out to us at [info@alchemab.com](mailto:info@alchemab.com).

| Model variant | Parameters | Config |
| ------------- | ---------- | ------ |
| [AntiBERTa2](https://huggingface.co/alchemab/antiberta2) | 202M | 16 layers, 16 heads, 1024 hidden size |
| [AntiBERTa2-CSSP](https://huggingface.co/alchemab/antiberta2-cssp) | 202M | 16 layers, 16 heads, 1024 hidden size |

## Example usage

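A minimal sketch, assuming the checkpoint loads with the standard RoFormer classes from the `transformers` library; residues in the input sequence are separated by whitespace, as in the widget example above.
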
```python
from transformers import (
    RoFormerForMaskedLM,
    RoFormerForSequenceClassification,
    RoFormerTokenizer,
    pipeline,
)

tokenizer = RoFormerTokenizer.from_pretrained("alchemab/antiberta2")
model = RoFormerForMaskedLM.from_pretrained("alchemab/antiberta2")

# Infill a masked residue; amino acids are separated by whitespace,
# matching the widget example above
filler = pipeline(task="fill-mask", model=model, tokenizer=tokenizer)
print(filler("Ḣ Q V Q [MASK] E"))

# Reusing the pre-trained weights for a downstream task will warn
# that a new linear layer will be added
# and randomly initialized
new_model = RoFormerForSequenceClassification.from_pretrained("alchemab/antiberta2")
```