"MMfreeLM-2.7B: hgrn_bit architecture not recognized by transformers (after source install)"
We're currently experiencing a compatibility issue when trying to load the ridger/MMfreeLM-2.7B model. The primary error is ValueError: The checkpoint you are trying to load has model type 'hgrn_bit' but Transformers does not recognize this architecture.
We've attempted several troubleshooting steps, including:
Upgrading transformers via pip install --upgrade transformers.
Installing transformers directly from the main branch of its GitHub repository (pip install git+https://github.com/huggingface/transformers.git).
Performing a complete reinstallation of all dependencies, including mmfreelm.
Despite these efforts, the hgrn_bit architecture remains unrecognized by the transformers library, preventing the model from loading correctly. We're seeking guidance on potential solutions, such as specific transformers versions required, any custom installation steps, or known workarounds for this architecture.
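For anyone hitting the same error: this is not a bug in any particular transformers version. `hgrn_bit` is a custom architecture that lives outside transformers, so upgrading transformers alone can never fix it; some package (here, mmfreelm) has to register the model type before loading. The sketch below is a toy, pure-Python version of the lookup transformers performs, just to illustrate the mechanism; `CONFIG_MAPPING`, the built-in entries, and the `HGRNBitConfig` name are illustrative stand-ins, not the real internals.

```python
# Toy illustration of why the error occurs: AutoModel/AutoConfig look up the
# "model_type" field from the checkpoint's config.json in a registry, and
# 'hgrn_bit' is only present if another package registered it beforehand.
# (Names below are illustrative stand-ins, not transformers' real internals.)
CONFIG_MAPPING = {"llama": "LlamaConfig", "mistral": "MistralConfig"}

def resolve_architecture(model_type):
    """Return the config class name for a model type, or raise like transformers does."""
    if model_type not in CONFIG_MAPPING:
        raise ValueError(
            f"The checkpoint you are trying to load has model type '{model_type}' "
            "but Transformers does not recognize this architecture."
        )
    return CONFIG_MAPPING[model_type]

# Importing the companion package (mmfreelm) is expected to do the moral
# equivalent of this registration as an import side effect:
CONFIG_MAPPING["hgrn_bit"] = "HGRNBitConfig"

print(resolve_architecture("hgrn_bit"))  # resolves instead of raising
```

In practice this means `import mmfreelm` (or whatever registration step its README prescribes) must run in the same process before `AutoModelForCausalLM.from_pretrained(...)`; a fresh interpreter that only imports transformers will always raise the ValueError above.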
I finally got it working. The trick is to use the old dependency versions that the model was originally built against, rather than the latest releases. It worked, but only after some hassle.
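To make that fix reproducible, it helps to record exactly which versions are installed in the working environment so they can be pinned elsewhere. A small helper like the one below does that with only the standard library; the package names passed in are the ones this thread mentions, and any that are assumptions about your setup should be adjusted.

```python
# Sketch: snapshot the installed versions of the packages relevant to this
# issue, so the "old dependencies" environment can be pinned and reproduced.
from importlib.metadata import version, PackageNotFoundError

def report_versions(packages):
    """Return {package_name: installed version string, or None if missing}."""
    out = {}
    for name in packages:
        try:
            out[name] = version(name)
        except PackageNotFoundError:
            out[name] = None  # not installed in this environment
    return out

# Package list is an assumption based on this thread; extend as needed.
print(report_versions(["transformers", "torch", "mmfreelm", "triton"]))
```

Running this in the environment where loading succeeds, then `pip install transformers==<that version>` etc. in the broken one, avoids guessing which combination was used to build the model.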