---
library_name: transformers
tags: []
---

# Time Saving Stated Aim Classifier

This is a roberta-base model trained to classify whether a set of explicit stated aims extracted from a British [historical patent](https://huggingface.co/datasets/matthewleechen/300YearsOfBritishPatents) includes a time-saving objective.

Labels were manually generated and then checked with Gemini 2.0 Flash using the attached [prompt](https://huggingface.co/matthewleechen/time-saving_stated_aim_classifier/blob/main/time-saving_prompt.txt).

Hyperparameters:
- learning rate: 3e-5
- batch size: 128

Test set results:

```text
{'eval_loss': 0.5350350737571716,
 'eval_accuracy': 0.89,
 'eval_precision': 0.8972633319553535,
 'eval_recall': 0.89,
 'eval_f1': 0.8891426257718393,
 'eval_runtime': 0.4081,
 'eval_samples_per_second': 245.053,
 'eval_steps_per_second': 2.451}
```