Instructions for using prelington/CodexTrouter with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- fastText
How to use prelington/CodexTrouter with fastText:
```python
from huggingface_hub import hf_hub_download
import fasttext

model = fasttext.load_model(hf_hub_download("prelington/CodexTrouter", "model.bin"))
```
- Notebooks
- Google Colab
- Kaggle
Create ProTalk_ModelExporter.py
ProTalk_ModelExporter.py (+6 −0):

```python
# ProTalk_ModelExporter.py: download microsoft/phi-2 and re-export its
# weights as a single safetensors file.
from transformers import AutoModelForCausalLM
from safetensors.torch import save_file

model_path = "microsoft/phi-2"
model = AutoModelForCausalLM.from_pretrained(model_path, torch_dtype="auto")
save_file(model.state_dict(), "ProTalkModel.safetensors")
```