runtime error
Exit code: 1. Reason: oints, so we will NOT tie them. You should update the config with `tie_word_embeddings=False` to silence this warning
generation_config.json: 100%|██████████| 116/116 [00:00<00:00, 260kB/s]
Traceback (most recent call last):
  File "/app/app.py", line 21, in <module>
    chat = pipeline("text-generation", model=model, tokenizer=tokenizer)
  File "/usr/local/lib/python3.13/site-packages/transformers/pipelines/__init__.py", line 1028, in pipeline
    return pipeline_class(model=model, task=task, **kwargs)
  File "/usr/local/lib/python3.13/site-packages/transformers/pipelines/text_generation.py", line 100, in __init__
    super().__init__(*args, **kwargs)
  File "/usr/local/lib/python3.13/site-packages/transformers/pipelines/base.py", line 861, in __init__
    self.model.to(self.device)
  File "/usr/local/lib/python3.13/site-packages/transformers/modeling_utils.py", line 3529, in to
    return super().to(*args, **kwargs)
  File "/usr/local/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1381, in to
    return self._apply(convert)
  File "/usr/local/lib/python3.13/site-packages/torch/nn/modules/module.py", line 933, in _apply
    module._apply(fn)
  File "/usr/local/lib/python3.13/site-packages/torch/nn/modules/module.py", line 933, in _apply
    module._apply(fn)
  File "/usr/local/lib/python3.13/site-packages/torch/nn/modules/module.py", line 964, in _apply
    param_applied = fn(param)
  File "/usr/local/lib/python3.13/site-packages/torch/nn/modules/module.py", line 1374, in convert
    raise NotImplementedError(
    ...<2 lines>...
    ) from None
NotImplementedError: Cannot copy out of meta tensor; no data! Please use torch.nn.Module.to_empty() instead of torch.nn.Module.to() when moving module from meta to a different device.
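The failing step is the pipeline calling `self.model.to(self.device)` on a model whose parameters still live on the `meta` device (placeholder tensors with no data, as happens when a model is instantiated without materializing its weights). A minimal sketch of that behavior using plain `torch.nn` (no model download; the `Linear` layer here is illustrative, not the app's actual model):

```python
import torch
import torch.nn as nn

# Create a module on the "meta" device: parameters exist only as shape/dtype
# placeholders with no backing storage, mirroring the model in the traceback.
with torch.device("meta"):
    layer = nn.Linear(4, 4)

# .to() must copy parameter data, but meta tensors have none, so it raises
# the same NotImplementedError seen in the log.
try:
    layer.to("cpu")
except NotImplementedError:
    print("to() failed: meta tensors hold no data")

# to_empty() is the fix the error message suggests: it allocates
# *uninitialized* storage on the target device instead of copying. Real
# weights must then be loaded into it (e.g. via load_state_dict).
layer = layer.to_empty(device="cpu")
print(layer.weight.device)  # cpu
```

In the transformers context, the usual remedy is to ensure the model's weights are fully materialized before handing it to `pipeline` (or to let `pipeline`/`from_pretrained` place the model itself via its device arguments) rather than calling `.to()` on a meta-initialized module.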