---
library_name: transformers
tags: []
---

# Capital Saving Stated Aim Classifier

This is a `roberta-base` model trained to classify whether an explicit set of stated aims, extracted from a British [historical patent](https://huggingface.co/datasets/matthewleechen/300YearsOfBritishPatents), includes a capital-saving objective. Labels were generated manually and checked with Gemini 2.0 Flash using the attached [prompt](https://huggingface.co/matthewleechen/capital-saving_stated_aim_classifier/blob/main/capital-saving_prompt.txt).

Hyperparameters:

- lr = 3e-5
- batch size = 128

Test set results:

```text
{'eval_loss': 0.32574835419654846, 'eval_accuracy': 0.89, 'eval_precision': 0.8916686674669867,
 'eval_recall': 0.89, 'eval_f1': 0.89003300330033, 'eval_runtime': 0.4104,
 'eval_samples_per_second': 243.688, 'eval_steps_per_second': 2.437}
```
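
The model can be loaded with the standard `transformers` text-classification pipeline. A minimal sketch, assuming the model id matches this repository and noting that the example stated-aims text is hypothetical and the emitted label names depend on the training config:

```python
from transformers import pipeline

# Load the fine-tuned roberta-base classifier from the Hub
# (model id assumed from this card's repository)
clf = pipeline(
    "text-classification",
    model="matthewleechen/capital-saving_stated_aim_classifier",
)

# Hypothetical stated-aims passage from a patent specification
text = (
    "The object of this invention is to reduce the quantity of machinery "
    "and plant required in the spinning of cotton."
)

result = clf(text)
print(result)  # list with one dict per input: {'label': ..., 'score': ...}
```

Passing a list of strings instead of a single string returns one prediction per item, which is convenient for scoring a whole corpus of extracted stated aims in batches.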