Instructions to use Clementio/PLRS with libraries, inference providers, notebooks, and local apps.

- Libraries
  - Transformers

How to use Clementio/PLRS with Transformers:

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("Clementio/PLRS", dtype="auto")
```

- Notebooks
  - Google Colab
  - Kaggle
Upload requirements.txt with huggingface_hub
requirements.txt (+7 -8)
```diff
@@ -1,8 +1,7 @@
-streamlit=
-torch=
-
-
-
-
-
-huggingface_hub==0.20.0
+streamlit>=1.32.0
+torch>=2.9.0
+pandas>=2.0.0
+numpy>=1.24.0
+networkx>=3.1
+scikit-learn>=1.3.0
+huggingface_hub>=0.20.0
```
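The commit title says the file was uploaded with `huggingface_hub`. A minimal sketch of what that looks like is below; the repo id `Clementio/PLRS` and the commit message are taken from this page, while the helper function name is illustrative, and authentication (a Hub token with write access) is assumed. The actual network call is left commented out.

```python
# A minimal sketch of uploading requirements.txt with the huggingface_hub
# client. Repo id and commit message come from this page; the helper name
# upload_kwargs is hypothetical, used only to group the arguments.

def upload_kwargs(repo_id: str = "Clementio/PLRS") -> dict:
    # Arguments for HfApi.upload_file, collected in one place.
    return {
        "path_or_fileobj": "requirements.txt",  # local file to send
        "path_in_repo": "requirements.txt",     # destination path in the repo
        "repo_id": repo_id,
        "commit_message": "Upload requirements.txt with huggingface_hub",
    }

# To actually push (requires `pip install huggingface_hub` and a write token,
# e.g. via `huggingface-cli login`):
#   from huggingface_hub import HfApi
#   HfApi().upload_file(**upload_kwargs())
```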