runtime error

Exit code: 1. Reason:

zation_utils_base.py:1617: FutureWarning: `clean_up_tokenization_spaces` was not set. It will be set to `True` by default. This behavior will be deprecated in transformers v4.45, and will be then set to `False` by default. For more details check this issue: https://github.com/huggingface/transformers/issues/31884
  warnings.warn(
model.safetensors: 100%|ā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆā–ˆ| 714M/714M [00:04<00:00, 177MB/s]
Some weights of BertForTokenClassification were not initialized from the model checkpoint at bert-base-multilingual-cased and are newly initialized: ['classifier.bias']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
āŒ Error loading model: Cannot copy out of meta tensor; no data!
Traceback (most recent call last):
  File "/home/user/app/app.py", line 82, in <module>
    raise e
  File "/home/user/app/app.py", line 77, in <module>
    model = model.to(device)
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2952, in to
    return super().to(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1160, in to
    return self._apply(convert)
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 810, in _apply
    module._apply(fn)
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 833, in _apply
    param_applied = fn(param)
  File "/usr/local/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1158, in convert
    return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
NotImplementedError: Cannot copy out of meta tensor; no data!
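The failing step is `model.to(device)`: the model's parameters are on PyTorch's "meta" device, which records shapes but holds no storage, so there is nothing to copy. This typically happens when weights are loaded lazily (e.g. via `low_cpu_mem_usage=True` or a `device_map` in `from_pretrained`). A minimal sketch, assuming only PyTorch is installed, that reproduces the error outside transformers and shows the supported workaround:

```python
import torch.nn as nn

# A module built on the "meta" device has parameter shapes but no data,
# so .to() raises the same NotImplementedError seen in the traceback.
layer = nn.Linear(4, 4, device="meta")
try:
    layer.to("cpu")
except NotImplementedError as e:
    print("reproduced:", e)

# to_empty() allocates real (uninitialized) storage on the target device
# instead of copying, which is the supported way to move a meta module.
layer = layer.to_empty(device="cpu")
print(layer.weight.is_meta)  # False
```

In the transformers context the usual fixes are to load eagerly (pass `low_cpu_mem_usage=False` and no `device_map` to `from_pretrained`, then call `.to(device)`), or to let `device_map` place the weights and skip the manual `.to(device)` call entirely.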
