Instructions for using kunalr63/simple_transformer with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use kunalr63/simple_transformer with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="kunalr63/simple_transformer")
```
- Notebooks
- Google Colab
- Kaggle
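The token-classification pipeline returns one dictionary per predicted token, with `entity`, `score`, `word`, `start`, and `end` keys. As a sketch of post-processing that output, merging B-/I- tagged tokens into entity spans (the sample `predictions` below are illustrative stand-ins, not real output from this model):

```python
# Group consecutive tokens that share an entity label into spans.
# The sample `predictions` list mimics the standard token-classification
# pipeline output format; the values are made up for illustration.
predictions = [
    {"word": "New", "entity": "B-LOC", "score": 0.98, "start": 0, "end": 3},
    {"word": "York", "entity": "I-LOC", "score": 0.97, "start": 4, "end": 8},
    {"word": "is", "entity": "O", "score": 0.99, "start": 9, "end": 11},
]

def group_entities(preds):
    """Merge B-/I- tagged tokens into (label, text) spans, skipping 'O'."""
    spans = []
    for p in preds:
        tag = p["entity"]
        if tag == "O":
            continue
        prefix, _, label = tag.partition("-")
        if prefix == "I" and spans and spans[-1][0] == label:
            # Continue the current span of the same entity type.
            spans[-1] = (label, spans[-1][1] + " " + p["word"])
        else:
            # Start a new span.
            spans.append((label, p["word"]))
    return spans

print(group_entities(predictions))  # → [('LOC', 'New York')]
```

Alternatively, passing `aggregation_strategy="simple"` to `pipeline()` makes the pipeline do this grouping itself.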
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("kunalr63/simple_transformer")
model = AutoModelForTokenClassification.from_pretrained("kunalr63/simple_transformer")
```
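When loading the model directly, per-token predictions come from an argmax over the output logits, mapped through the model's `id2label` table. A minimal sketch of that decoding step on synthetic logits (the label set here is hypothetical; the real mapping lives in `model.config.id2label`, and the real logits come from `model(**tokenizer(text, return_tensors="pt")).logits`):

```python
# Decode token-classification logits: argmax per token, then map ids to labels.
# Both the id2label mapping and the logits below are synthetic stand-ins
# for illustration; they are not taken from kunalr63/simple_transformer.
id2label = {0: "O", 1: "B-SKILL", 2: "I-SKILL"}  # hypothetical label set

logits = [  # one row of scores per token, one column per label id
    [2.1, 0.3, -1.0],  # token 0 -> highest score at id 0 ("O")
    [-0.5, 3.2, 0.1],  # token 1 -> highest score at id 1 ("B-SKILL")
    [0.0, 0.4, 2.7],   # token 2 -> highest score at id 2 ("I-SKILL")
]

def decode(logit_rows, labels):
    """Pick the highest-scoring label id for each token and map it to a name."""
    return [labels[max(range(len(row)), key=row.__getitem__)] for row in logit_rows]

print(decode(logits, id2label))  # → ['O', 'B-SKILL', 'I-SKILL']
```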