CODE:
```python
# Load model directly
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("deepseek-ai/DeepSeek-V3.2-Exp", torch_dtype="auto")
```
|
|
ERROR:
```text
Traceback (most recent call last):
  File "/tmp/.cache/uv/environments-v2/ca40b4f14047fc5a/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1360, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
                   ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/.cache/uv/environments-v2/ca40b4f14047fc5a/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1048, in __getitem__
    raise KeyError(key)
KeyError: 'deepseek_v32'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/tmp/deepseek-ai_DeepSeek-V3.2-Exp_1PWAFqQ.py", line 16, in <module>
    model = AutoModelForCausalLM.from_pretrained("deepseek-ai/DeepSeek-V3.2-Exp", torch_dtype="auto")
  File "/tmp/.cache/uv/environments-v2/ca40b4f14047fc5a/lib/python3.13/site-packages/transformers/models/auto/auto_factory.py", line 549, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
                     ~~~~~~~~~~~~~~~~~~~~~~~~~~^
        pretrained_model_name_or_path,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        ...<4 lines>...
        **kwargs,
        ^^^^^^^^^
    )
    ^
  File "/tmp/.cache/uv/environments-v2/ca40b4f14047fc5a/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1362, in from_pretrained
    raise ValueError(
    ...<8 lines>...
    )
ValueError: The checkpoint you are trying to load has model type `deepseek_v32` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

You can update Transformers with the command `pip install --upgrade transformers`. If this does not work, and the checkpoint is very new, then there may not be a release version that supports this model yet. In this case, you can get the most up-to-date code by installing Transformers from source with the command `pip install git+https://github.com/huggingface/transformers.git`
```
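Since the failure happens before any weights are downloaded, a cheap preflight check on the installed Transformers version can turn this traceback into a clear, actionable message. Below is a minimal stdlib-only sketch; `MIN_VERSION` is a placeholder, not the actual requirement for `deepseek_v32` — check the model card or the Transformers release notes for the real minimum.

```python
# Preflight check: compare the installed Transformers version against a
# minimum before calling from_pretrained, so an unsupported model_type
# fails with a clear message instead of a KeyError deep in AutoConfig.
from importlib import metadata

# Hypothetical minimum; the real threshold depends on when the
# architecture was merged into a Transformers release.
MIN_VERSION = (4, 45)

def parse_version(v: str) -> tuple:
    """Turn a version string like '4.44.2' into (4, 44, 2).

    Non-numeric suffixes such as 'dev0' or 'rc1' are dropped, so
    '4.56.0.dev0' parses as (4, 56, 0).
    """
    parts = []
    for piece in v.split("."):
        digits = ""
        for ch in piece:
            if ch.isdigit():
                digits += ch
            else:
                break
        if digits:
            parts.append(int(digits))
    return tuple(parts)

def transformers_supports(min_version: tuple = MIN_VERSION) -> bool:
    """Return True if an installed Transformers meets min_version."""
    try:
        installed = parse_version(metadata.version("transformers"))
    except metadata.PackageNotFoundError:
        return False
    return installed >= min_version

if not transformers_supports():
    print(
        "Installed transformers is too old (or missing) for this checkpoint; "
        "try: pip install git+https://github.com/huggingface/transformers.git"
    )
```

If a pinned release is required for reproducibility, the same check can instead raise a `RuntimeError` naming the exact version to install, rather than printing a hint.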
|
|