Instructions for using ppower1/huggingface_train with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
  - Transformers
How to use ppower1/huggingface_train with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="ppower1/huggingface_train")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("ppower1/huggingface_train")
model = AutoModelForSequenceClassification.from_pretrained("ppower1/huggingface_train")
```
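Once the pipeline is created, raw text can be passed to it directly. A minimal usage sketch follows; the input sentence is illustrative, and the label names returned depend entirely on how this model was fine-tuned:

```python
# Sketch only: the sample input is made up, and the returned labels
# (e.g. LABEL_0) depend on this model's own classification config.
result = pipe("This is a sample sentence to classify.")
print(result)  # e.g. [{'label': 'LABEL_0', 'score': 0.98}]
```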
- Notebooks (a minimal starter cell is sketched after this list)
  - Google Colab
  - Kaggle
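In a fresh Colab or Kaggle notebook, the library usually has to be installed first. A minimal starter cell, assuming a text-classification head as in the snippets above (the sample input is illustrative):

```python
# Minimal notebook starter cell (sketch): install the dependencies,
# then load the model through the high-level pipeline helper.
!pip install transformers torch

from transformers import pipeline

pipe = pipeline("text-classification", model="ppower1/huggingface_train")
print(pipe("Notebook smoke test sentence."))
```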
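The intro also mentions inference providers. A hedged sketch using huggingface_hub's InferenceClient follows; whether any provider actually hosts ppower1/huggingface_train is an assumption, and the call will fail if none does:

```python
from huggingface_hub import InferenceClient

# Sketch only: assumes some inference provider serves this model and
# that an HF token is available (env var or cached login); the request
# errors out otherwise. The input text is illustrative.
client = InferenceClient()
results = client.text_classification(
    "Sample text for remote classification.",
    model="ppower1/huggingface_train",
)
print(results)  # list of label/score predictions
```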