Instructions for using rparkr/LFM2.5-1.2B-Instruct-Coding with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use rparkr/LFM2.5-1.2B-Instruct-Coding with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("rparkr/LFM2.5-1.2B-Instruct-Coding", dtype="auto")
```

- PEFT
How to use rparkr/LFM2.5-1.2B-Instruct-Coding with PEFT:
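The model card does not declare a PEFT task type, so no snippet is auto-generated. As a minimal sketch, the model could be wrapped with a LoRA adapter for fine-tuning; this assumes the model loads as a causal LM, and the `target_modules` names (`q_proj`, `v_proj`) are assumptions that should be checked against the model's actual architecture before training.

```python
# Sketch: attach a LoRA adapter to rparkr/LFM2.5-1.2B-Instruct-Coding via PEFT.
# Assumptions: the checkpoint loads as a causal LM, and "q_proj"/"v_proj"
# exist as attention projection names in this architecture -- verify first.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, TaskType, get_peft_model

base = AutoModelForCausalLM.from_pretrained(
    "rparkr/LFM2.5-1.2B-Instruct-Coding", dtype="auto"
)

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,         # task type for an instruct/chat model
    r=16,                                  # adapter rank
    lora_alpha=32,                         # scaling factor
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],   # assumed module names -- check the model
)

model = get_peft_model(base, lora_config)
model.print_trainable_parameters()         # only the adapter weights are trainable
```

The adapter can then be trained with the usual Transformers `Trainer` workflow and saved with `model.save_pretrained(...)`, which stores only the small adapter weights rather than the full checkpoint.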
- Notebooks
- Google Colab
- Kaggle