Runtime error
Exit code: 1. Reason: ce_download=True`. warnings.warn(
special_tokens_map.json: 100%|██████████| 369/369 [00:00<00:00, 2.48MB/s]
tokenizer_config.json: 100%|██████████| 4.32k/4.32k [00:00<00:00, 36.7MB/s]
You are using the legacy behaviour of the <class 'transformers.models.llama.tokenization_llama.LlamaTokenizer'>. This means that tokens that come after special tokens will not be properly handled. We recommend you to read the related pull request available at https://github.com/huggingface/transformers/pull/24565
Error loading tokenizer: not a string
config.json: 100%|██████████| 752/752 [00:00<00:00, 9.17MB/s]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 26, in <module>
    model = AutoModelForCausalLM.from_pretrained(MODEL_NAME, force_download=True, device_map="cpu", torch_dtype=torch.float16)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 461, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 999, in from_pretrained
    return config_class.from_dict(config_dict, **unused_kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/configuration_utils.py", line 744, in from_dict
    config = cls(**config_dict)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/llama/configuration_llama.py", line 145, in __init__
    self._rope_scaling_validation()
  File "/usr/local/lib/python3.10/site-packages/transformers/models/llama/configuration_llama.py", line 163, in _rope_scaling_validation
    raise ValueError(
ValueError: `rope_scaling` must be a dictionary with with two fields, `name` and `factor`, got {'factor': 4.0, 'rope_type': 'linear', 'type': 'linear'}
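The traceback shows an older `transformers` build rejecting a newer-style `rope_scaling` entry in the model's `config.json`: the hub config carries three keys (`factor`, `rope_type`, `type`), while the legacy validator accepts only a two-field dictionary. The simplest fix is usually upgrading `transformers`; if the version is pinned, a workaround is to trim the dict down to the two fields before the config is built (for example by patching a local copy of `config.json`). A minimal sketch, assuming the accepted keys are `type` and `factor` as present in the failing dict (the exact key names the validator expects can differ across `transformers` versions):

```python
# Sketch: trim a `rope_scaling` dict so a legacy LlamaConfig validator,
# which rejects any dict that does not have exactly two fields, accepts it.
# The example values are taken from the ValueError in the log above.

def sanitize_rope_scaling(rope_scaling):
    """Keep only the two fields the legacy validator checks for."""
    if not isinstance(rope_scaling, dict):
        return rope_scaling  # leave non-dict values for the validator to report
    allowed = {"type", "factor"}
    return {k: v for k, v in rope_scaling.items() if k in allowed}

# The dict from the ValueError in the log:
broken = {"factor": 4.0, "rope_type": "linear", "type": "linear"}
fixed = sanitize_rope_scaling(broken)
print(fixed)  # {'factor': 4.0, 'type': 'linear'}
```

In practice the trimmed dict would be written back into a local `config.json` (or passed as a `rope_scaling=` override to `AutoConfig.from_pretrained`, which merges keyword overrides into the config dict) so that loading no longer trips the length check.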