Instructions for using Synthyra/ESMplusplus_large with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Synthyra/ESMplusplus_large with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="Synthyra/ESMplusplus_large", trust_remote_code=True)
```

```python
# Load model directly
from transformers import AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained("Synthyra/ESMplusplus_large", trust_remote_code=True, dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
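As a quick illustration of how the fill-mask pipeline above might be used on a protein sequence: the snippet below builds a masked input offline and only sketches the actual model call in comments. The `<mask>` token, the example sequence, and the output keys are assumptions based on typical ESM-style tokenizers and standard Transformers fill-mask behavior, not details confirmed by this model card.

```python
def mask_position(seq: str, i: int, mask_token: str = "<mask>") -> str:
    """Replace the residue at index i with the tokenizer's mask token.

    Assumes the tokenizer uses "<mask>" (common for ESM-style models).
    """
    return seq[:i] + mask_token + seq[i + 1:]

# Arbitrary example protein sequence (for illustration only).
sequence = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
masked = mask_position(sequence, 10)
print(masked)

# With network access and the model downloaded, the masked sequence could
# then be scored like this (sketch, not verified against this checkpoint):
# from transformers import pipeline
# pipe = pipeline("fill-mask", model="Synthyra/ESMplusplus_large", trust_remote_code=True)
# for pred in pipe(masked):
#     print(pred["token_str"], pred["score"])
```

Keeping the pipeline call separate avoids downloading the checkpoint just to test the input-preparation logic.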
Upload LICENSE with huggingface_hub
LICENSE changed (first four lines; the notice line was rewritten, the rest is unchanged):

PLEASE NOTE THE APACHE LICENSE ONLY APPLIES TO THE CODE IN THE FastPLMs GITHUB AND ASSOCIATED HUGGINGFACE REPOSITORIES, NOT NECESSARILY THE MODEL WEIGHTS. THOSE LICENSES CAN BE FOUND HERE https://github.com/Synthyra/FastPLMs/tree/main/licenses

Apache License
Version 2.0, January 2004