---
library_name: transformers
tags: []
---
# Capital Saving Stated Aim Classifier
This is a roberta-base model fine-tuned to classify whether an explicit set of stated aims, extracted from a British [historical patent](https://huggingface.co/datasets/matthewleechen/300YearsOfBritishPatents), includes a capital-saving objective.
Labels were generated manually and checked with Gemini 2.0 Flash using the attached [prompt](https://huggingface.co/matthewleechen/capital-saving_stated_aim_classifier/blob/main/capital-saving_prompt.txt).
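A minimal inference sketch using the `transformers` pipeline API. The model id is taken from this repository; the example aim text and the label names the model emits are assumptions, so inspect the pipeline output on your own data:

```python
from transformers import pipeline

# Load the fine-tuned roberta-base classifier from the Hub.
# Model id inferred from this repo; adjust if you use a local path.
clf = pipeline(
    "text-classification",
    model="matthewleechen/capital-saving_stated_aim_classifier",
)

# Hypothetical stated aim extracted from a patent specification.
aims = "The object of this invention is to lessen the cost of the machinery required."
result = clf(aims)
print(result)  # list with one dict containing 'label' and 'score'
```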
Hyperparameters:
- learning rate: 3e-5
- batch size: 128
Test set results:
```text
{'eval_loss': 0.32574835419654846,
'eval_accuracy': 0.89,
'eval_precision': 0.8916686674669867,
'eval_recall': 0.89,
'eval_f1': 0.89003300330033,
'eval_runtime': 0.4104,
'eval_samples_per_second': 243.688,
'eval_steps_per_second': 2.437}
```