ariG23498 (HF Staff) committed 7e27261 · verified · Parent(s): 869e105

Upload LGAI-EXAONE_K-EXAONE-236B-A23B_0.txt with huggingface_hub

```CODE:
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="LGAI-EXAONE/K-EXAONE-236B-A23B")
messages = [
    {"role": "user", "content": "Who are you?"},
]
pipe(messages)
```

ERROR:
Traceback (most recent call last):
  File "/tmp/.cache/uv/environments-v2/70e3935133a397ad/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1384, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
                   ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/tmp/.cache/uv/environments-v2/70e3935133a397ad/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1087, in __getitem__
    raise KeyError(key)
KeyError: 'exaone_moe'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/tmp/LGAI-EXAONE_K-EXAONE-236B-A23B_0o7v97A.py", line 26, in <module>
    pipe = pipeline("text-generation", model="LGAI-EXAONE/K-EXAONE-236B-A23B")
  File "/tmp/.cache/uv/environments-v2/70e3935133a397ad/lib/python3.13/site-packages/transformers/pipelines/__init__.py", line 734, in pipeline
    config = AutoConfig.from_pretrained(
        model, _from_pipeline=task, code_revision=code_revision, **hub_kwargs, **model_kwargs
    )
  File "/tmp/.cache/uv/environments-v2/70e3935133a397ad/lib/python3.13/site-packages/transformers/models/auto/configuration_auto.py", line 1386, in from_pretrained
    raise ValueError(
    ...<8 lines>...
    )
ValueError: The checkpoint you are trying to load has model type `exaone_moe` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

You can update Transformers with the command `pip install --upgrade transformers`. If this does not work, and the checkpoint is very new, then there may not be a release version that supports this model yet. In this case, you can get the most up-to-date code by installing Transformers from source with the command `pip install git+https://github.com/huggingface/transformers.git`
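The chained traceback above reflects a simple registry-lookup pattern: the config loader resolves the checkpoint's `model_type` string against a mapping of known architectures, and a missing key is translated into the friendlier `ValueError`. A minimal self-contained sketch of that pattern (the registry contents and function name here are illustrative, not the actual transformers internals):

```python
# Sketch of a model-type registry that converts an unknown `model_type`
# into a descriptive ValueError, mirroring the traceback above.
# CONFIG_REGISTRY and resolve_config are illustrative names.
CONFIG_REGISTRY = {
    "llama": "LlamaConfig",
    "gpt2": "GPT2Config",
}

def resolve_config(model_type: str) -> str:
    try:
        # First lookup: raises KeyError for unregistered architectures.
        return CONFIG_REGISTRY[model_type]
    except KeyError:
        # Second stage: re-raise as a ValueError with guidance for the user.
        raise ValueError(
            f"The checkpoint you are trying to load has model type `{model_type}` "
            "but this library version does not recognize that architecture."
        ) from None

try:
    resolve_config("exaone_moe")
except ValueError as err:
    print(err)
```

In this framing, the fix suggested by the error message (upgrading or installing from source) simply installs a version whose registry already contains the `exaone_moe` entry.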