Instructions for using Narsil/small3 with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use Narsil/small3 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="Narsil/small3")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("Narsil/small3")
model = AutoModelForTokenClassification.from_pretrained("Narsil/small3")
```

- Notebooks
- Google Colab
- Kaggle
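A token-classification pipeline returns one dict per tagged token, with keys such as `entity`, `score`, `word`, `start`, and `end`. As a minimal sketch of post-processing that output (the sample tokens and scores below are hypothetical, not actual model predictions):

```python
# Hypothetical pipeline output for illustration; the structure matches the
# standard token-classification pipeline (one dict per tagged token).
outputs = [
    {"entity": "B-PER", "score": 0.99, "word": "John", "start": 0, "end": 4},
    {"entity": "I-PER", "score": 0.98, "word": "Smith", "start": 5, "end": 10},
]

# Group the tagged words by entity type, stripping the B-/I- scheme prefixes.
entities = {}
for tok in outputs:
    label = tok["entity"].split("-", 1)[-1]
    entities.setdefault(label, []).append(tok["word"])

print(entities)  # → {'PER': ['John', 'Smith']}
```

For real predictions, replace `outputs` with the result of calling the pipeline on your text, e.g. `pipe("John Smith works here.")`.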
Commit: adds model-00002-of-00003.safetensors (binary file, 232 kB).