---
license: apache-2.0
---

# Model Card for text_prompt_safety_checker

## Model Description

This is a safety checker model designed to detect NSFW words in text. The intended use case is screening prompts for a text-to-image generator.

This model is based on the google-bert/bert-base-uncased model and has been fine-tuned for binary classification.
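Since the model is a binary classifier, its head emits two logits per input, which are converted to probabilities and mapped to a label. The official inference code is linked below; the following is only a minimal sketch of that logit-to-label step, under the assumption that index 1 is the NSFW (positive) class and index 0 the SFW (negative) class.

```python
import math

def softmax(logits):
    # Convert raw classifier logits to probabilities (numerically stable).
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits, threshold=0.5):
    # Assumed label order: index 0 = SFW (negative), index 1 = NSFW
    # (positive). The real mapping is defined by the training setup.
    probs = softmax(logits)
    label = "NSFW" if probs[1] >= threshold else "SFW"
    return label, probs[1]

# Example: logits that favor the negative (SFW) class.
label, nsfw_prob = classify([2.3, -1.1])
```

The threshold of 0.5 is a placeholder; a deployment could lower it to trade false negatives for false positives.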

## Training Data

The model was trained on custom textual data, with NSFW text instances as positive examples and SFW text instances as negative examples.

## Model Code

Inference code is available at https://github.com/HowToSD/cremage/tree/main/modules/text_prompt_safety_checker. The model license specified here does not apply to that source code.

## Intended Use

- This model is intended for detecting NSFW words in text prompts.

## Limitations

- Classification with this model is not foolproof and may fail to screen some NSFW words or semantic concepts in a prompt. By choosing to use this model, users acknowledge and accept these limitations and the associated risks.

## License

The model weights are licensed under the Apache License 2.0. This license applies only to the model weights, not to the model code.