Instructions to use emaeon/trained_cppbert7 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use emaeon/trained_cppbert7 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="emaeon/trained_cppbert7")

# Or load the model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("emaeon/trained_cppbert7")
model = AutoModelForSequenceClassification.from_pretrained("emaeon/trained_cppbert7")
```
- Notebooks
- Google Colab
- Kaggle
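When the model is loaded directly (rather than through `pipeline`), the sequence-classification head returns raw logits that must be mapped to probabilities. A minimal sketch of that step, using hypothetical logit values since the label set of emaeon/trained_cppbert7 is not documented here:

```python
import math

def softmax(logits):
    # Numerically stable softmax: shift by the max before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits from the classification head
# (the real label count depends on the model's config).
logits = [2.1, -0.3]
probs = softmax(logits)
predicted = max(range(len(probs)), key=probs.__getitem__)
print(predicted, [round(p, 3) for p in probs])
```

The `pipeline` helper performs this conversion internally and returns the label name and score directly.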
- Xet hash: 557987b6896d5edb70c996bcbf4de564c4b43349ef8fa3e63d0359a3744448b5
- Size of remote file: 499 MB
- SHA256: c1faf9e9722dfdae556057b87fba2021afed6b563e6b978d929f34fad800c013