---
library_name: tf-keras
---

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
| Hyperparameter | Value |
| :-- | :-- |
| name | AdamWeightDecay |
| learning_rate.module | official.nlp.optimization |
| learning_rate.class_name | WarmUp |
| learning_rate.config.initial_learning_rate.class_name | `__tensor__` |
| learning_rate.config.initial_learning_rate.config.value | 1.799999881768599e-05 |
| learning_rate.config.initial_learning_rate.config.dtype | float32 |
| learning_rate.config.decay_schedule_fn.module | keras.optimizers.schedules |
| learning_rate.config.decay_schedule_fn.class_name | PolynomialDecay |
| learning_rate.config.decay_schedule_fn.config.initial_learning_rate | 2e-05 |
| learning_rate.config.decay_schedule_fn.config.decay_steps | 5848 |
| learning_rate.config.decay_schedule_fn.config.end_learning_rate | 0 |
| learning_rate.config.decay_schedule_fn.config.power | 1.0 |
| learning_rate.config.decay_schedule_fn.config.cycle | False |
| learning_rate.config.decay_schedule_fn.config.name | None |
| learning_rate.config.decay_schedule_fn.registered_name | None |
| learning_rate.config.warmup_steps | 584.8000000000001 |
| learning_rate.config.power | 1.0 |
| learning_rate.config.name | None |
| learning_rate.registered_name | WarmUp |
| decay | 0.0 |
| beta_1 | 0.9 |
| beta_2 | 0.999 |
| epsilon | 1e-06 |
| amsgrad | False |
| weight_decay_rate | 0.95 |
| training_precision | float32 |
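
Read together, the nested keys describe an `AdamWeightDecay` optimizer from `official.nlp.optimization` (the TensorFlow Model Garden), driven by a `WarmUp` schedule that wraps a linear `PolynomialDecay`. The original training script is not part of this card, but as a rough sketch, the serialized config corresponds to something like the following (assumes the `tf-models-official` package is installed):

```python
import tensorflow as tf
from official.nlp import optimization  # provided by tf-models-official

# Linear decay (power=1.0) from 2e-05 to 0 over 5848 steps.
decay_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=2e-05,
    decay_steps=5848,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

# Linear warmup for the first ~585 steps (the stored 584.8 is 10% of the
# 5848 decay steps), then hand off to the decay schedule. The warmup target
# of 1.8e-05 is the float32 value serialized as `__tensor__` in the table.
warmup_schedule = optimization.WarmUp(
    initial_learning_rate=1.8e-05,
    decay_schedule_fn=decay_schedule,
    warmup_steps=584.8,
    power=1.0,
)

# Adam with decoupled weight decay, matching the remaining table rows.
optimizer = optimization.AdamWeightDecay(
    learning_rate=warmup_schedule,
    weight_decay_rate=0.95,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-06,
)
```

Note the unusually high `weight_decay_rate` of 0.95 (typical BERT-style fine-tuning uses values around 0.01); it is reproduced above exactly as serialized.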

## Model Plot

<details>
<summary>View Model Plot</summary>



</details>