This is a Gemma model uploaded with the KerasNLP library; it can be used with the JAX, TensorFlow, and PyTorch backends. The model targets the CausalLM (causal language modeling) task.
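A minimal usage sketch (assuming KerasNLP with Gemma support and access to the `gemma2_instruct_2b_en` preset; the backend must be selected before Keras is imported, and the prompt format below is the standard Gemma instruction-tuned chat template):

```python
import os
os.environ["KERAS_BACKEND"] = "jax"  # or "tensorflow" / "torch"

import keras_nlp

# Load the causal LM from the preset this card describes.
gemma_lm = keras_nlp.models.GemmaCausalLM.from_preset("gemma2_instruct_2b_en")

# Generate with the Gemma instruction-tuned turn markers.
prompt = (
    "<start_of_turn>user\n"
    "Why is the sky blue?<end_of_turn>\n"
    "<start_of_turn>model\n"
)
print(gemma_lm.generate(prompt, max_length=256))
```

Downloading the preset requires accepting the Gemma license on Kaggle or Hugging Face and roughly 5 GB of disk space for the 2B weights.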
Model config:
- name: gemma_backbone (`gemma2_instruct_2b_en`)
- trainable: True
- vocabulary_size: 256000
- num_layers: 26
- num_query_heads: 8
- num_key_value_heads: 4
- hidden_dim: 2304
- intermediate_dim: 18432
- head_dim: 256
- layer_norm_epsilon: 1e-06
- dropout: 0
- query_head_dim_normalize: True
- use_post_ffw_norm: True
- use_post_attention_norm: True
- final_logit_soft_cap: 30.0
- attention_logit_soft_cap: 50.0
- sliding_window_size: 4096
- use_sliding_window_attention: True
Fine-tuning config:
- rank: 64
- epoch: 10
- dataset: dataset_path_ko_aihub_all_capital_area_moments_10k_[0-6]
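As a sanity check, a rough parameter count can be derived from the config values above. This is only a sketch: it assumes KerasNLP's convention that `intermediate_dim` (18432) is the combined width of the two gated-FFN input projections, assumes tied input/output embeddings, and ignores the small RMSNorm weights:

```python
# Config values copied from the list above.
vocab, hidden, layers = 256_000, 2_304, 26
q_heads, kv_heads, head_dim = 8, 4, 256
intermediate = 18_432  # combined gate + up projection width (assumed convention)

embedding = vocab * hidden  # tied with the output head, so counted once

attn = (
    hidden * (q_heads * head_dim)        # query projection
    + 2 * hidden * (kv_heads * head_dim) # key and value projections (GQA: 4 KV heads)
    + (q_heads * head_dim) * hidden      # output projection
)
ffn = hidden * intermediate + (intermediate // 2) * hidden  # gate+up, then down

total = embedding + layers * (attn + ffn)
print(f"~{total / 1e9:.2f}B parameters")  # ~2.61B, consistent with the "2b" preset name
```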
This model card has been generated automatically and should be completed by the model author. See Model Cards documentation for more information.