Instructions to use recursionpharma/OpenPhenom with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use recursionpharma/OpenPhenom with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="recursionpharma/OpenPhenom", trust_remote_code=True)

# Load the model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("recursionpharma/OpenPhenom", trust_remote_code=True, dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
Official repo for Recursion's accepted spotlight paper at the NeurIPS 2023 Generative AI & Biology workshop.

Paper: https://arxiv.org/abs/2309.16064

## Provided code

The baseline Vision Transformer architecture backbone used in this work can be built with the following code snippet from timm: