Instructions for using Synthyra/ESMplusplus_large with libraries, inference providers, notebooks, and local apps.

How to use Synthyra/ESMplusplus_large with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="Synthyra/ESMplusplus_large", trust_remote_code=True)
```

```python
# Load the model directly
from transformers import AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained("Synthyra/ESMplusplus_large", trust_remote_code=True, dtype="auto")
```
Upload modeling_esm_plusplus.py with huggingface_hub
modeling_esm_plusplus.py (+1 -0)
```diff
@@ -685,6 +685,7 @@ class EmbeddingMixin:
         save: bool = True,
         sql_db_path: str = 'embeddings.db',
         save_path: str = 'embeddings.pth',
+        **kwargs,
     ) -> Optional[dict[str, torch.Tensor]]:
         """Embed a dataset of protein sequences.
```
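The one-line change above makes `embed_dataset` absorb extra keyword arguments instead of raising a `TypeError` when a caller passes options the method does not use. A minimal sketch of this pattern with a hypothetical stand-in function (not the actual ESM++ implementation):

```python
from typing import Optional

def embed_dataset(
    sequences: list[str],
    save: bool = True,
    sql_db_path: str = 'embeddings.db',
    save_path: str = 'embeddings.pth',
    **kwargs,  # absorbs unrecognized keyword arguments instead of raising TypeError
) -> Optional[dict]:
    # Hypothetical body: report which extra options were silently ignored
    return {
        "n_sequences": len(sequences),
        "ignored_kwargs": sorted(kwargs),
    }

# A caller can now forward a shared options dict without pre-filtering it
result = embed_dataset(["MKTAYIAK", "MALWMRLL"], batch_size=8, unused_flag=True)
print(result["ignored_kwargs"])  # -> ['batch_size', 'unused_flag']
```

Without the `**kwargs` parameter, the same call would fail with `TypeError: embed_dataset() got an unexpected keyword argument 'batch_size'`; the trade-off is that misspelled option names are also silently swallowed.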