Instructions to use Guscode/DKbert-hatespeech-detection with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Guscode/DKbert-hatespeech-detection with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="Guscode/DKbert-hatespeech-detection")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("Guscode/DKbert-hatespeech-detection")
model = AutoModelForSequenceClassification.from_pretrained("Guscode/DKbert-hatespeech-detection")
```

- Notebooks
- Google Colab
- Kaggle
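The classifier above returns raw logits that are converted to per-class probabilities via softmax, which is what `pipeline("text-classification")` reports as scores. The snippet below mimics that post-processing step with dummy values — the logits and the two-class assumption are illustrative, not this model's actual outputs:

```python
import math

# Dummy logits standing in for model(**inputs).logits; real values come from
# running the model. Two classes are assumed here for illustration only.
logits = [2.0, -1.0]

# Softmax converts raw logits into probabilities that sum to 1.
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# The predicted label is the index of the highest probability.
predicted_class = max(range(len(probs)), key=probs.__getitem__)
print(probs, predicted_class)
```

With real inputs you would tokenize a Danish sentence with the tokenizer loaded above and feed the resulting tensors to the model before this step.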
Update README.md
README.md CHANGED

```diff
@@ -36,7 +36,7 @@ See more on [DK hate github](https://github.com/Guscode/DKbert-hatespeech-detect
 
 ## Training procedure
 
-- BOTXO Nordic Bert
+- [BOTXO Nordic Bert](https://huggingface.co/DJSammy/bert-base-danish-uncased_BotXO,ai)
 - Learning rate: 1e-5,
 - Batch size: 16
 - Max sequence length: 128
```
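The hyperparameters listed in the training procedure can be collected into a single config. The mapping to `transformers` keyword arguments in the comments below is a hedged sketch — the original training script is not shown, so those kwarg names are assumptions based on the usual Trainer/tokenizer API:

```python
# Hyperparameters from the README's "Training procedure" section
config = {
    "base_model": "DJSammy/bert-base-danish-uncased_BotXO,ai",  # BOTXO Nordic Bert
    "learning_rate": 1e-5,       # e.g. TrainingArguments(learning_rate=...)
    "batch_size": 16,            # e.g. TrainingArguments(per_device_train_batch_size=...)
    "max_sequence_length": 128,  # e.g. tokenizer(..., truncation=True, max_length=...)
}

for key, value in config.items():
    print(f"{key}: {value}")
```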