Instructions to use recursionpharma/OpenPhenom with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
  - Transformers
- Notebooks
  - Google Colab
  - Kaggle

How to use recursionpharma/OpenPhenom with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="recursionpharma/OpenPhenom", trust_remote_code=True)

# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("recursionpharma/OpenPhenom", trust_remote_code=True, dtype="auto")
```
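OpenPhenom is a channel-agnostic image encoder, so feature extraction can yield one embedding per input channel. Downstream code often needs to collapse these into a single per-image vector. The snippet below is a minimal sketch of that pooling step only, using a random array in place of real model output; the channel count, embedding size, and pooling choices are illustrative assumptions, not the model's documented API:

```python
import numpy as np

# Stand-in for channel-wise model output: one embedding per channel.
# C=6 and D=384 are illustrative assumptions, not OpenPhenom's spec.
C, D = 6, 384
channelwise = np.random.rand(C, D).astype(np.float32)

# Option 1: concatenate channel embeddings into one long vector
# (embedding size then depends on the number of channels).
concat = channelwise.reshape(-1)   # shape (C * D,)

# Option 2: mean-pool across channels for a fixed-size,
# channel-count-invariant vector.
pooled = channelwise.mean(axis=0)  # shape (D,)

print(concat.shape, pooled.shape)
```

Which option is appropriate depends on whether your downstream task expects a fixed embedding size across datasets with different channel counts.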
We also provide a [notebook](https://huggingface.co/recursionpharma/OpenPhenom/blob/main/RxRx3-core_inference.ipynb) for running inference on [RxRx3-core](https://huggingface.co/datasets/recursionpharma/rxrx3-core).
## Training, evaluation and testing details
See paper linked above for details on model training and evaluation. Primary hyperparameters are included in the repo linked above.