Patch for Model Card Example - Expected all tensors to be on the same device

#7
by preoccupy9217 - opened

Running the example listed on the Model Card generated the following errors:

/home/aac/projects/granite-4-tf-rocm/venv/lib/python3.12/site-packages/transformers/generation/utils.py:2479: UserWarning: You are calling .generate() with the `input_ids` being on a device type different than your model's device. `input_ids` is on cuda, whereas the model is on cpu. You may experience unexpected behaviors or slower generation. Please make sure that you have put `input_ids` to the correct device by calling for example input_ids = input_ids.to('cpu') before running `.generate()`. 

...

RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cpu and cuda:0! (when checking argument for argument index in method wrapper_CUDA__index_select)

The following patch resolved the error by appending ".to(device)" when creating the model:

10c10
<     )
---
>     ).to(device)
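For reference, here is a minimal sketch of what the corrected loading code looks like with the patch applied. The model id and prompt below are placeholders, not the exact Model Card example, so substitute the values from the card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Pick the accelerator if one is visible. Note that ROCm builds of
# PyTorch also expose HIP devices through the "cuda" device type,
# which is why the MI210 shows up as cuda:0 in the error message.
device = "cuda" if torch.cuda.is_available() else "cpu"

model_path = "your/model-id"  # placeholder — use the id from the Model Card

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
).to(device)  # the fix: move the model to the same device as the inputs

# The tokenizer output is moved to `device` as well, so model weights
# and input_ids end up on the same device and generate() succeeds.
inputs = tokenizer("Hello", return_tensors="pt").to(device)
output = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Without the `.to(device)` call, the model stays on the CPU while the inputs are moved to the GPU, which is exactly the cpu/cuda:0 mismatch the traceback reports.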

The test system was Ubuntu 24.04.2 LTS with ROCm 6.4.0, 2x AMD EPYC 7763 64-core processors, and 1x AMD Instinct MI210, using pip3 install transformers[torch]@git+https://github.com/huggingface/transformers as of May 20, 2025.
