Runtime error

Exit code: 1. Reason:

        hidden_states=hidden_states,
    ...<7 lines>...
        **kwargs,
    )
  File "/root/.pyenv/versions/3.13.12/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1776, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/root/.pyenv/versions/3.13.12/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1787, in _call_impl
    return forward_call(*args, **kwargs)
  File "/root/.cache/huggingface/modules/transformers_modules/UMBRANETWORK/Goblin_hyphen_Bruno_hyphen_Potato/18ef183b2415e58d129c0cbef6fc88c0f15d68d1/modeling_freedom_omega.py", line 174, in forward
    return self._forward_gqa(
        hidden_states, attention_mask, position_ids,
        past_key_value, output_attentions, use_cache,
        cache_position, position_embeddings, **kwargs
    )
  File "/root/.cache/huggingface/modules/transformers_modules/UMBRANETWORK/Goblin_hyphen_Bruno_hyphen_Potato/18ef183b2415e58d129c0cbef6fc88c0f15d68d1/modeling_freedom_omega.py", line 224, in _forward_gqa
    attn_output, attn_weights = self._compute_attention(
        query_states, key_states, value_states, attention_mask, output_attentions
    )
  File "/root/.cache/huggingface/modules/transformers_modules/UMBRANETWORK/Goblin_hyphen_Bruno_hyphen_Potato/18ef183b2415e58d129c0cbef6fc88c0f15d68d1/modeling_freedom_omega.py", line 455, in _compute_attention
    attn_output = torch.matmul(attn_weights, value_states)
RuntimeError: Expected size for first two dimensions of batch2 tensor to be: [16, 30] but got: [16, 1].
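A minimal sketch of what this shape mismatch typically means, assuming the failing matmul is the attention-weights × values product over 3-D tensors (all concrete shapes here are illustrative guesses from the error message: 16 is taken to be a flattened batch × heads dimension, 30 the total key/value sequence length): attn_weights scores 30 key positions, but value_states holds only 1 position. That asymmetry is the classic symptom of the value side of a KV cache not being concatenated with the new token's values while the key side was.

```python
import torch

# Illustrative shapes inferred from the error message; 16 is assumed to be
# a flattened batch*heads dimension, 30 the total key/value sequence length.
bh, q_len, kv_len, head_dim = 16, 1, 30, 64

attn_weights = torch.softmax(torch.randn(bh, q_len, kv_len), dim=-1)  # [16, 1, 30]
value_states = torch.randn(bh, 1, head_dim)  # [16, 1, 64] -- only the newest token

try:
    torch.matmul(attn_weights, value_states)
except RuntimeError as e:
    # Reproduces: "Expected size for first two dimensions of batch2 tensor
    # to be: [16, 30] but got: [16, 1]."
    print(e)

# If the cause is an un-concatenated value cache, prepending the cached
# values restores the expected [bh, kv_len, head_dim] shape:
past_values = torch.randn(bh, kv_len - 1, head_dim)          # hypothetical cache
value_states = torch.cat([past_values, value_states], dim=1)  # [16, 30, 64]
attn_output = torch.matmul(attn_weights, value_states)
print(attn_output.shape)  # torch.Size([16, 1, 64])
```

The fix belongs in `_compute_attention`'s caller: whatever path builds `value_states` must return the same sequence length as `key_states` before the matmul at line 455.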

Container logs:
