runtime error
Exit code: 1. Reason: o quantization)...
config.json: 100%|██████████| 1.97k/1.97k [00:00<00:00, 15.0MB/s]
`torch_dtype` is deprecated! Use `dtype` instead!
Traceback (most recent call last):
  File "/app/app.py", line 25, in <module>
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_NAME,
        ...<4 lines>...
        quantization_config=None  # 🔥 the main fix
    )
  File "/usr/local/lib/python3.13/site-packages/transformers/models/auto/auto_factory.py", line 549, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
        pretrained_model_name_or_path,
        ...<4 lines>...
        **kwargs,
    )
  File "/usr/local/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1372, in from_pretrained
    return config_class.from_dict(config_dict, **unused_kwargs)
  File "/usr/local/lib/python3.13/site-packages/transformers/configuration_utils.py", line 839, in from_dict
    logger.info(f"Model config {config}")
  File "/usr/local/lib/python3.13/site-packages/transformers/configuration_utils.py", line 873, in __repr__
    return f"{self.__class__.__name__} {self.to_json_string()}"
  File "/usr/local/lib/python3.13/site-packages/transformers/configuration_utils.py", line 985, in to_json_string
    config_dict = self.to_diff_dict()
  File "/usr/local/lib/python3.13/site-packages/transformers/configuration_utils.py", line 887, in to_diff_dict
    config_dict = self.to_dict()
  File "/usr/local/lib/python3.13/site-packages/transformers/configuration_utils.py", line 964, in to_dict
    self.quantization_config.to_dict()
AttributeError: 'NoneType' object has no attribute 'to_dict'
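The traceback shows the crash happens while the model config is being built: an explicit `quantization_config=None` in the `from_pretrained` call is stored on the config object, and `PretrainedConfig.to_dict()` then tries to call `.to_dict()` on that `None`. The usual remedy is to omit the keyword entirely rather than pass `None`. A minimal sketch of that idea, assuming nothing beyond the truncated call in the log (`MODEL_NAME` and the other kwargs are placeholders from `/app/app.py`; `clean_kwargs` is a hypothetical helper, not a transformers API):

```python
# Hypothetical helper: drop any kwarg whose value is None before forwarding
# it to from_pretrained, so `quantization_config=None` never reaches the
# config object and to_dict() is never called on None.
def clean_kwargs(**kwargs):
    return {k: v for k, v in kwargs.items() if v is not None}

# Usage sketch (commented out because it needs the real app context):
# model = AutoModelForCausalLM.from_pretrained(
#     MODEL_NAME,
#     dtype="auto",  # the log also warns that `torch_dtype` is deprecated
#     **clean_kwargs(quantization_config=None),  # kwarg dropped entirely
# )
```

Simply deleting the `quantization_config=None` line from the call would have the same effect; the helper only matters when the value is computed at runtime and may or may not be `None`.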