Instructions for using debisoft/mistral-7b-thinking-function_calling-V0 with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use debisoft/mistral-7b-thinking-function_calling-V0 with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "debisoft/mistral-7b-thinking-function_calling-V0", dtype="auto"
)
```
- Notebooks
- Google Colab
- Kaggle
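The model's special tokens (listed further below) suggest a function-calling prompt layout in which available tools are declared inside `<tools>…</tools>`. The exact chat template is not documented on this page, so the structure below is an assumption; check the model's tokenizer configuration for the authoritative format. A minimal, model-free sketch with a hypothetical tool schema:

```python
import json

def build_prompt(tools: list, user_message: str) -> str:
    """Assemble a function-calling prompt using the model's special tokens.

    NOTE: the exact template is an assumption, not taken from the model card;
    consult the model's tokenizer_config.json for the real format.
    """
    tools_block = "<tools>" + json.dumps(tools) + "</tools>"
    return f"{tools_block}\n{user_message}"

# Hypothetical tool schema, for illustration only.
weather_tool = {
    "name": "get_weather",
    "description": "Get the current weather for a city",
    "parameters": {"city": {"type": "string"}},
}

prompt = build_prompt([weather_tool], "What is the weather in Paris?")
print(prompt)
```

The string returned by `build_prompt` would then be tokenized and passed to the model as usual.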
```json
{
  "</think>": 32772,
  "</tool_call>": 32774,
  "</tool_reponse>": 32776,
  "</tools>": 32770,
  "<eos>": 32777,
  "<pad>": 32768,
  "<think>": 32771,
  "<tool_call>": 32773,
  "<tool_reponse>": 32775,
  "<tools>": 32769
}
```
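Since these special tokens delimit the model's reasoning and tool-calling spans, downstream code typically needs to extract the text between them. A minimal sketch in plain Python (no model required; the sample output string is hypothetical, and `tool_reponse` is spelled as it appears in the tokenizer):

```python
import json
import re

def extract_span(text, tag):
    """Return the text between <tag> and </tag>, or None if absent."""
    match = re.search(rf"<{tag}>(.*?)</{tag}>", text, re.DOTALL)
    return match.group(1).strip() if match else None

# Hypothetical generation output illustrating the expected tag layout.
output = (
    "<think>The user wants the weather, so I should call the tool.</think>"
    '<tool_call>{"name": "get_weather", "arguments": {"city": "Paris"}}</tool_call>'
)

reasoning = extract_span(output, "think")
call = extract_span(output, "tool_call")
if call is not None:
    payload = json.loads(call)
    print(payload["name"])  # name of the requested tool
```

In a real loop, the text inside `<tool_call>` would be parsed, the named function executed, and its result fed back wrapped in `<tool_reponse>…</tool_reponse>`.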