ariG23498 (HF Staff) committed
Commit 2114cce · verified · Parent(s): ee4adef

Upload LGAI-EXAONE_K-EXAONE-236B-A23B_1.txt with huggingface_hub

LGAI-EXAONE_K-EXAONE-236B-A23B_1.txt ADDED
CODE:
```python
# Load model directly
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("LGAI-EXAONE/K-EXAONE-236B-A23B", dtype="auto")
```
ERROR:
Traceback (most recent call last):
  File "/tmp/.cache/uv/environments-v2/638983b704b56622/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1384, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
                   ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/.cache/uv/environments-v2/638983b704b56622/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1087, in __getitem__
    raise KeyError(key)
KeyError: 'exaone_moe'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/tmp/LGAI-EXAONE_K-EXAONE-236B-A23B_1T0Begs.py", line 25, in <module>
    model = AutoModelForCausalLM.from_pretrained("LGAI-EXAONE/K-EXAONE-236B-A23B", dtype="auto")
  File "/tmp/.cache/uv/environments-v2/638983b704b56622/lib/python3.13/site-packages/transformers/models/auto/auto_factory.py", line 317, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
                     ~~~~~~~~~~~~~~~~~~~~~~~~~~^
        pretrained_model_name_or_path,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<4 lines>...
        **kwargs,
        ^^^^^^^^^
    )
    ^
  File "/tmp/.cache/uv/environments-v2/638983b704b56622/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1386, in from_pretrained
    raise ValueError(
    ...<8 lines>...
    )
ValueError: The checkpoint you are trying to load has model type `exaone_moe` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

You can update Transformers with the command `pip install --upgrade transformers`. If this does not work, and the checkpoint is very new, then there may not be a release version that supports this model yet. In this case, you can get the most up-to-date code by installing Transformers from source with the command `pip install git+https://github.com/huggingface/transformers.git`.
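The root cause is that the installed Transformers release has no configuration class registered for the `model_type` string `exaone_moe`. Before retrying a multi-hundred-gigabyte download, it can help to probe the local install directly. Below is a minimal diagnostic sketch; `has_model_type` is a hypothetical helper name (not a Transformers API), and it inspects `CONFIG_MAPPING`, the same registry whose `KeyError` appears in the traceback above:

```python
# Minimal diagnostic sketch: check whether the installed Transformers
# release recognizes a given `model_type` before attempting to download
# checkpoint weights. `has_model_type` is a hypothetical helper name.

def has_model_type(model_type: str) -> bool:
    try:
        # CONFIG_MAPPING is the registry AutoConfig consults; the
        # KeyError in the traceback above came from this mapping.
        from transformers.models.auto.configuration_auto import CONFIG_MAPPING
    except ImportError:
        # Transformers is not installed at all.
        return False
    return model_type in CONFIG_MAPPING

if __name__ == "__main__":
    if not has_model_type("exaone_moe"):
        print("`exaone_moe` is unsupported here: upgrade Transformers "
              "or install it from source as suggested above.")
```

If the check fails even after `pip install --upgrade transformers`, installing from source (as the error message suggests) is the remaining option, since support for very new architectures often lands on the `main` branch before a release.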