Instructions for using HuggingFaceTB/SmolLM3-3B-checkpoints with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use HuggingFaceTB/SmolLM3-3B-checkpoints with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("HuggingFaceTB/SmolLM3-3B-checkpoints", dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
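For text generation, the task-specific `AutoModelForCausalLM` class is typically used rather than the bare `AutoModel` shown above. A minimal sketch, assuming `transformers` and a compatible `torch` are installed (downloading the ~3B-parameter weights requires network access and several GB of disk; the prompt below is illustrative):

```python
MODEL_ID = "HuggingFaceTB/SmolLM3-3B-checkpoints"

def build_chat(prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat-message format expected by
    tokenizer.apply_chat_template()."""
    return [{"role": "user", "content": prompt}]

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    # Lazy import so the prompt helper above works without the heavy deps.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    # dtype="auto" follows the snippet above; older transformers releases
    # used the torch_dtype keyword instead.
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, dtype="auto")
    inputs = tokenizer.apply_chat_template(
        build_chat(prompt), add_generation_prompt=True, return_tensors="pt"
    )
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)

# Example usage (downloads weights on first run):
# print(generate("What is supervised fine-tuning?"))
```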
Community discussions:
- "Can someone please explain what is short SFT step?" (#9, opened 6 months ago by gshasiri)
- "short sft config" (3 replies; #8, opened 6 months ago by zhengwenzhen)
- "Model Merging Mechanism?" (#7, opened 7 months ago by ojus1)
- "Main branch" (#6, opened 8 months ago by aldakata)
- "will open the 'short sft' model?" (2 replies; #5, opened 9 months ago by leo98xh)