Text Classification
Transformers
Safetensors
English
modernbert
security
jailbreak-detection
prompt-injection
llm-safety
Eval Results (legacy)
text-embeddings-inference
Instructions to use rootfs/function-call-sentinel with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use rootfs/function-call-sentinel with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="rootfs/function-call-sentinel")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("rootfs/function-call-sentinel")
model = AutoModelForSequenceClassification.from_pretrained("rootfs/function-call-sentinel")
```

- Notebooks
- Google Colab
- Kaggle
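The pipeline returns label/score dicts directly; when loading the model directly, the raw logits must be converted to probabilities yourself. A minimal sketch of that step in plain Python (the logit values and the SAFE/INJECTION label names here are illustrative assumptions, not the model's actual id2label mapping):

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of raw classifier scores
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits from the classification head for one prompt
logits = [2.0, -1.5]
probs = softmax(logits)

# Hypothetical two-label head; check the model's config id2label for real names
labels = ["SAFE", "INJECTION"]
label = labels[probs.index(max(probs))]  # → "SAFE" for these scores
```

The same conversion is what `pipeline("text-classification")` applies internally before reporting its top label and score.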
Add hidden_act for Candle compatibility
config.json CHANGED (+3 -2)

```diff
@@ -49,5 +49,6 @@
   "sparse_pred_ignore_index": -100,
   "sparse_prediction": false,
   "transformers_version": "4.57.3",
-  "vocab_size": 50368
-}
+  "vocab_size": 50368,
+  "hidden_act": "gelu"
+}
```
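The patch has to leave config.json valid JSON, which is why the diff both adds a trailing comma to the vocab_size line and re-emits the closing brace. A quick sanity check over a stand-in for the tail of the patched file (the real config's leading keys are omitted here):

```python
import json

# Stand-in for the tail of the patched config.json; the full file has many
# more keys before this point
patched_tail = """{
  "sparse_pred_ignore_index": -100,
  "sparse_prediction": false,
  "transformers_version": "4.57.3",
  "vocab_size": 50368,
  "hidden_act": "gelu"
}"""

cfg = json.loads(patched_tail)  # raises json.JSONDecodeError if the edit broke the file
```

After the patch, loaders that require an explicit activation key (such as Candle) can read `cfg["hidden_act"]` and get `"gelu"`.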