Commit 3636257
Parent(s): 7708c08

Add model with Git LFS support

Files changed:
- README.md +39 -0
- config.json +17 -0
- model.safetensors +3 -0
- training_args.bin +0 -0
README.md
CHANGED
@@ -1,3 +1,42 @@
 ---
 license: mit
+tags:
+- truthfulness
+- bert
+- text-classification
+- dual-classifier
+pipeline_tag: text-classification
+---
+
+# Truthfulness Detection Model
+
+Fine-tuned BERT model for detecting truthfulness in text at both token and sentence levels.
+
+## Model Description
+
+This model uses a dual-classifier architecture on top of BERT to:
+- Classify truthfulness at the sentence level (returns probability 0-1)
+- Classify truthfulness for each token (returns probability 0-1 per token)
+
+Low scores indicate likely false statements, high scores indicate likely true statements.
+
+## Example Output
+
+For "The earth is flat.":
+- Sentence score: 0.0736 (7.36% - correctly identified as false)
+- Token scores: ~0.10 for each token
+
+## Training
+
+- Base model: bert-base-uncased
+- Training samples: 6,330
+- Epochs: 3
+- Batch size: 16
+- Training time: 49 seconds on H100
+
+## Custom Architecture Required
+
+⚠️ This model uses a custom `BERTForDualTruthfulness` class. You cannot load it with standard AutoModel.
+See the [implementation code](https://huggingface.co/prompterminal/classifier/blob/main/model_architecture.py) for the model class definition.---
+license: mit
 ---
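The "Custom Architecture Required" warning in the README can be made concrete. Below is a hypothetical sketch of what a dual-head class like `BERTForDualTruthfulness` might look like: a BERT backbone with one sentence-level head on the pooled output and one token-level head on the per-token hidden states, both squashed to 0-1 with a sigmoid. The real definition lives in `model_architecture.py` and may differ; the tiny `BertConfig` here exists only so the sketch runs without downloading weights.

```python
# Hypothetical sketch of a dual-classifier head on BERT; the actual class
# ships in model_architecture.py and may differ in detail.
import torch
import torch.nn as nn
from transformers import BertConfig, BertModel

class BERTForDualTruthfulness(nn.Module):
    def __init__(self, config: BertConfig):
        super().__init__()
        self.bert = BertModel(config)
        self.dropout = nn.Dropout(config.hidden_dropout_prob)
        # One head scores the whole sentence, one scores every token.
        self.sentence_head = nn.Linear(config.hidden_size, 1)
        self.token_head = nn.Linear(config.hidden_size, 1)

    def forward(self, input_ids, attention_mask=None):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        pooled = self.dropout(out.pooler_output)        # [batch, hidden]
        tokens = self.dropout(out.last_hidden_state)    # [batch, seq, hidden]
        sentence_score = torch.sigmoid(self.sentence_head(pooled)).squeeze(-1)
        token_scores = torch.sigmoid(self.token_head(tokens)).squeeze(-1)
        return sentence_score, token_scores

# Tiny random-weight config so the sketch runs without fetching checkpoints.
cfg = BertConfig(vocab_size=100, hidden_size=32, num_hidden_layers=2,
                 num_attention_heads=2, intermediate_size=64)
model = BERTForDualTruthfulness(cfg).eval()
ids = torch.randint(0, 100, (1, 6))
with torch.no_grad():
    s, t = model(ids)
print(s.shape, t.shape)  # torch.Size([1]) torch.Size([1, 6])
```

To use the real checkpoint, one would import the class from `model_architecture.py`, build it from this repo's `config.json`, and load `model.safetensors` into it, since `AutoModel` cannot resolve the custom architecture on its own.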
config.json
ADDED
@@ -0,0 +1,17 @@
+{
+  "architectures": ["BERTForDualTruthfulness"],
+  "model_type": "bert",
+  "hidden_size": 768,
+  "num_labels": 2,
+  "vocab_size": 30522,
+  "hidden_dropout_prob": 0.1,
+  "attention_probs_dropout_prob": 0.1,
+  "max_position_embeddings": 512,
+  "type_vocab_size": 2,
+  "initializer_range": 0.02,
+  "layer_norm_eps": 1e-12,
+  "num_attention_heads": 12,
+  "num_hidden_layers": 12,
+  "intermediate_size": 3072,
+  "_name_or_path": "bert-base-uncased"
+}
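The config added in this commit carries the standard bert-base-uncased dimensions (768 hidden, 12 layers, 12 heads). Since it is plain JSON, the fields can be sanity-checked with nothing but the standard library; this sketch inlines a subset of the committed file:

```python
import json

# A subset of the config.json fields committed above, verbatim.
config = json.loads("""
{
  "architectures": ["BERTForDualTruthfulness"],
  "model_type": "bert",
  "hidden_size": 768,
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "intermediate_size": 3072,
  "vocab_size": 30522,
  "max_position_embeddings": 512
}
""")

# hidden_size must divide evenly across attention heads.
assert config["hidden_size"] % config["num_attention_heads"] == 0
print(config["architectures"][0])  # BERTForDualTruthfulness
```

Note that `"architectures"` names the custom class rather than a stock `BertFor...` class, which is why `AutoModel` cannot instantiate it directly.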
model.safetensors
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d60413c7b238b1536b5b1cb96f5394cf1e1bde5182362545999f1b95395e8863
+size 437965064
training_args.bin
ADDED
Binary file (5.65 kB).