Instructions for using recursionpharma/OpenPhenom with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use recursionpharma/OpenPhenom with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="recursionpharma/OpenPhenom", trust_remote_code=True)
```

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("recursionpharma/OpenPhenom", trust_remote_code=True, dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
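The feature-extraction pipeline above returns nested lists of per-token features rather than a single vector per input. A common post-processing step is to mean-pool over the token axis to get one embedding per input. Below is a minimal sketch of that pooling step only; the pipeline call itself, and the exact input format OpenPhenom's custom code expects, are not shown here and should be checked against the model card.

```python
import numpy as np

def mean_pool(features):
    """Mean-pool per-token features from a feature-extraction
    pipeline into a single embedding vector.

    `features` is the nested-list output for one input, shaped
    [tokens][dim]; returns a 1-D numpy array of length dim.
    """
    arr = np.asarray(features, dtype=np.float32)  # (tokens, dim)
    return arr.mean(axis=0)

# Example with dummy per-token features (2 tokens, 4 dims):
tokens = [[1.0, 2.0, 3.0, 4.0],
          [3.0, 4.0, 5.0, 6.0]]
embedding = mean_pool(tokens)
print(embedding)  # [2. 3. 4. 5.]
```

Whether mean-pooling, a CLS token, or some model-specific aggregation is the right choice here depends on how OpenPhenom was trained, so treat this as a generic post-processing illustration.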
Kian Kenyon-Dean committed "Update README.md".
README.md CHANGED:

```diff
@@ -31,5 +31,7 @@ def vit_base_patch16_256(**kwargs):
 
 Additional code will be released as the date of the workshop gets closer.
 
+**While we cannot share all the internal code we've written for training and evaluation of these models, it would be very useful if interested persons could raise an Issue in this repo to tell us which aspects of the code would be of most interest to the broader community.**
+
 ## Provided models
 Stay tuned...
```