Text Classification
Transformers
Safetensors
PyTorch
English
deberta-v2
facebook
meta
llama
llama-3
text-embeddings-inference
Instructions to use meta-llama/Prompt-Guard-86M with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use meta-llama/Prompt-Guard-86M with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="meta-llama/Prompt-Guard-86M")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Prompt-Guard-86M")
model = AutoModelForSequenceClassification.from_pretrained("meta-llama/Prompt-Guard-86M")
```

- Inference
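When called on a prompt, the text-classification pipeline returns the top class with a probability score. Under the hood, the sequence-classification head emits one logit per class, and those logits are converted to probabilities with a softmax. A minimal sketch of that conversion in plain Python (the logit values here are hypothetical, not output from the model):

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for the classifier's output classes.
logits = [2.0, 0.5, -1.0]
probs = softmax(logits)
# The probabilities sum to 1; the largest logit gets the highest probability.
print([round(p, 3) for p in probs])
```

The pipeline reports only the highest-probability label by default; passing `top_k=None` to the pipeline call returns the scores for every class.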
- Notebooks
- Google Colab
- Kaggle
Updating links to the correct path
#25
by betodepaola - opened
No description provided.
vontimitta changed pull request status to merged