Instructions for using ppower1/huggingface_train with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use ppower1/huggingface_train with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="ppower1/huggingface_train")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("ppower1/huggingface_train")
model = AutoModelForSequenceClassification.from_pretrained("ppower1/huggingface_train")
```

A minimal usage sketch follows the notebook links below.

- Notebooks
- Google Colab
- Kaggle
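As a quick smoke test of the pipeline loaded above, it can be called directly on a string; a minimal sketch (the input sentence is illustrative, and the label names returned depend on this model's config):

```python
# Run the text-classification pipeline on an example input.
from transformers import pipeline

pipe = pipeline("text-classification", model="ppower1/huggingface_train")
result = pipe("This is a sample sentence to classify.")
print(result)  # e.g. [{"label": ..., "score": ...}]
```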
- Xet hash: a9fcfbec762acbf758afd9af79a28684d11a3f5bc3086deec826761d986b565e
- Size of remote file: 268 MB
- SHA256: 915ab77987344c9f146929a6220acd5c829a381561b67963651245088f51b19a
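To confirm a download matches the SHA256 above, the file can be hashed locally; a minimal sketch, assuming the 268 MB file is the model weights under the hypothetical filename model.safetensors (check the repo's file listing for the actual name):

```python
# Download one file from the repo and verify its SHA256 checksum.
# NOTE: the filename "model.safetensors" is an assumption.
import hashlib
from huggingface_hub import hf_hub_download

EXPECTED_SHA256 = "915ab77987344c9f146929a6220acd5c829a381561b67963651245088f51b19a"

path = hf_hub_download(repo_id="ppower1/huggingface_train", filename="model.safetensors")

sha = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        sha.update(chunk)

assert sha.hexdigest() == EXPECTED_SHA256, "checksum mismatch"
print("SHA256 verified:", sha.hexdigest())
```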
Xet efficiently stores large files inside Git by intelligently splitting them into unique chunks, which deduplicates storage and accelerates uploads and downloads.
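Xet transfer is handled transparently by huggingface_hub when the hf_xet package is installed, so no Xet-specific code should be needed on the client side; a minimal sketch, assuming a recent huggingface_hub:

```python
# Download the full repo; with the hf_xet package installed,
# huggingface_hub fetches Xet-backed files chunk by chunk.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="ppower1/huggingface_train")
print(local_dir)  # path to the locally cached snapshot
```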