Update modeling_chatglm.py for transformers 4.49 compatibility

#89

After updating transformers to version 4.49, the following error occurs: `AttributeError: 'DynamicCache' object has no attribute 'get_max_length'`.
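A minimal sketch of the kind of fix applied here, assuming the removed `Cache.get_max_length` was superseded by the newer `get_max_cache_shape()` API in recent transformers releases (the helper name `get_cache_max_length` is hypothetical, not from this PR):

```python
def get_cache_max_length(cache):
    """Return the cache's max length across transformers versions.

    Prefers the newer get_max_cache_shape() (assumed replacement for
    the removed get_max_length) and falls back to the legacy method
    on older transformers versions.
    """
    if hasattr(cache, "get_max_cache_shape"):
        # Newer API; returns None for a DynamicCache (unbounded)
        return cache.get_max_cache_shape()
    # Legacy API, removed in transformers 4.49
    return cache.get_max_length()
```

Guarding the call with `hasattr` keeps `modeling_chatglm.py` working on both older and newer transformers versions instead of pinning one.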

sylwia-kuros changed pull request title from Update modeling_chatglm.py to Update modeling_chatglm.py for transformers 4.49 compatibility
ZHANGYUXUAN-zR changed pull request status to merged

