runtime error
Traceback (most recent call last):
  File "/home/user/app/app.py", line 15, in <module>
    tokenized_dataset = tokenizer([example[0] for example in dataset], return_tensors="pt", padding=True, truncation=True)
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2872, in __call__
    encodings = self._call_one(text=text, text_pair=text_pair, **all_kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2958, in _call_one
    return self.batch_encode_plus(
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 3140, in batch_encode_plus
    padding_strategy, truncation_strategy, max_length, kwargs = self._get_padding_truncation_strategies(
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2777, in _get_padding_truncation_strategies
    raise ValueError(
ValueError: Asking to pad but the tokenizer does not have a padding token. Please select a token to use as `pad_token` `(tokenizer.pad_token = tokenizer.eos_token e.g.)` or add a new pad token via `tokenizer.add_special_tokens({'pad_token': '[PAD]'})`.
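The error means the tokenizer has no `pad_token`, so `padding=True` cannot work. A minimal sketch of the fix suggested by the error message itself, assuming a GPT-2-style causal-LM tokenizer (GPT-2 ships without a pad token; the model name here is an illustrative assumption, not taken from the log):

```python
from transformers import AutoTokenizer

# GPT-2 is a placeholder; substitute the checkpoint the app actually loads.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

# Causal-LM tokenizers often lack a pad token; reuse the EOS token for padding.
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token

# Batch tokenization with padding now succeeds instead of raising ValueError.
batch = tokenizer(
    ["hello world", "a longer example sentence"],
    return_tensors="pt",
    padding=True,
    truncation=True,
)
```

The alternative named in the error, `tokenizer.add_special_tokens({'pad_token': '[PAD]'})`, grows the vocabulary by one token, so the model's embedding matrix must then be resized with `model.resize_token_embeddings(len(tokenizer))`; reusing the EOS token avoids that extra step.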