Instructions for using Jibbscript/privacy-filter-oai with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Jibbscript/privacy-filter-oai with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="Jibbscript/privacy-filter-oai")

# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("Jibbscript/privacy-filter-oai")
model = AutoModelForTokenClassification.from_pretrained("Jibbscript/privacy-filter-oai")
```
- Transformers.js
How to use Jibbscript/privacy-filter-oai with Transformers.js:
```javascript
// npm i @huggingface/transformers
import { pipeline } from '@huggingface/transformers';

// Allocate pipeline
const pipe = await pipeline('token-classification', 'Jibbscript/privacy-filter-oai');
```
- Notebooks
- Google Colab
- Kaggle
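A token-classification pipeline returns one dict per detected span, each with character offsets into the input text. Below is a minimal sketch of redacting flagged spans from the original text. The `results` list is hypothetical output mimicking what `pipe(text)` would return with `aggregation_strategy="simple"`; the `NAME` and `EMAIL` labels are assumptions, since the actual label set depends on this model's training data:

```python
text = "Contact Jane Doe at jane@example.com for details."

# Hypothetical pipeline output; real entity labels depend on the model.
# Each dict mirrors the shape produced by:
#   pipe = pipeline("token-classification",
#                   model="Jibbscript/privacy-filter-oai",
#                   aggregation_strategy="simple")
results = [
    {"entity_group": "NAME", "start": 8, "end": 16, "score": 0.99},
    {"entity_group": "EMAIL", "start": 20, "end": 36, "score": 0.98},
]

def redact(text, entities):
    """Replace each flagged character span with [LABEL], working
    right to left so earlier offsets stay valid after each edit."""
    for ent in sorted(entities, key=lambda e: e["start"], reverse=True):
        text = text[:ent["start"]] + f"[{ent['entity_group']}]" + text[ent["end"]:]
    return text

print(redact(text, results))  # Contact [NAME] at [EMAIL] for details.
```

Processing spans right to left is the key detail: replacing a span changes the string length, so iterating left to right would invalidate the remaining offsets.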
Tokenizer configuration (commit 04ea9d7, file size: 234 Bytes):

```json
{
  "backend": "tokenizers",
  "eos_token": "<|endoftext|>",
  "model_input_names": [
    "input_ids",
    "attention_mask"
  ],
  "model_max_length": 128000,
  "pad_token": "<|endoftext|>",
  "tokenizer_class": "TokenizersBackend"
}
```
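The config can also be inspected programmatically without downloading the model. A quick sketch (the JSON string below duplicates the file contents shown above) that confirms the model pads with its end-of-text token and accepts long contexts:

```python
import json

# tokenizer_config.json contents, copied from the file shown above
config = json.loads("""
{
  "backend": "tokenizers",
  "eos_token": "<|endoftext|>",
  "model_input_names": ["input_ids", "attention_mask"],
  "model_max_length": 128000,
  "pad_token": "<|endoftext|>",
  "tokenizer_class": "TokenizersBackend"
}
""")

# Padding reuses the EOS token, and the declared context window is 128k.
assert config["pad_token"] == config["eos_token"] == "<|endoftext|>"
print(config["model_max_length"])  # 128000
```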