---
license: mit
base_model: gpt2
tags:
- generated_from_trainer
model-index:
- name: Model_1B_Bush
  results: []
---
# Model_1B_Bush
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on a large corpus of George W. Bush's first-term discourse on terrorism.
## To Prompt the Model
Try entering single words or short phrases, such as "terrorism is", "national security", or "our foreign policy should be", in the dialogue box on the right-hand side of this page, then click 'Compute' and wait for the results. The model takes a few seconds to load on your first prompt.
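You can also prompt the model programmatically. Below is a minimal sketch using the transformers text-generation pipeline; the repository id is a placeholder, so substitute the actual Hub id of this model.

```python
# Minimal sketch of prompting the model from Python instead of the hosted widget.
# "Model_1B_Bush" below is a placeholder repo id; replace it with the full Hub id.
from transformers import pipeline

generator = pipeline("text-generation", model="Model_1B_Bush")

prompt = "terrorism is"
outputs = generator(
    prompt,
    max_new_tokens=50,       # length of each continuation
    do_sample=True,          # sample rather than greedy-decode
    num_return_sequences=3,  # generate several continuations per prompt
)

for out in outputs:
    print(out["generated_text"])
```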
## Intended uses & limitations
This model is intended as an experiment in the utility of LLMs for discourse analysis on a specific corpus of political rhetoric.
## Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5.0
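For reference, a rough sketch of how these values map onto Hugging Face `TrainingArguments` is shown below. Only the hyperparameter values come from this card; the output directory and the surrounding `Trainer` setup are assumptions.

```python
# Sketch of TrainingArguments matching the listed hyperparameters.
# The Adam betas=(0.9, 0.999) and epsilon=1e-08 above are the Trainer defaults,
# so they do not need to be set explicitly.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Model_1B_Bush",      # assumed output path
    learning_rate=5e-05,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5.0,
)
```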
## Framework versions
- Transformers 4.36.0.dev0
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.15.0