---
license: other
license_link: https://huggingface.co/microsoft/phi-4/resolve/main/LICENSE
language:
- en
pipeline_tag: text-generation
tags:
- phi
- nlp
- math
- code
- chat
- conversational
- mlx
inference:
  parameters:
    temperature: 0
widget:
- messages:
  - role: user
    content: How should I explain the Internet?
library_name: transformers
base_model: microsoft/Phi-4
---

# decisionslab/Dlab-852-Mini-Preview-4-bit

The model [decisionslab/Dlab-852-Mini-Preview-4-bit](https://huggingface.co/decisionslab/Dlab-852-Mini-Preview-4-bit)
was converted to MLX format from [microsoft/Phi-4](https://huggingface.co/microsoft/Phi-4)
using mlx-lm version **0.21.5**.

## Use with mlx
```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

# Download the model and tokenizer from the Hugging Face Hub.
model, tokenizer = load("decisionslab/Dlab-852-Mini-Preview-4-bit")

prompt = "hello"

# Wrap the prompt in the model's chat template when one is available.
if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
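
mlx-lm also ships a command-line entry point, so the same model can be queried without writing any Python. A minimal sketch, assuming the standard `mlx_lm.generate` CLI from the package installed above (flag names may vary slightly between mlx-lm versions):

```shell
# One-off generation from the command line
# (downloads the model from the Hub on first use).
mlx_lm.generate --model decisionslab/Dlab-852-Mini-Preview-4-bit \
  --prompt "How should I explain the Internet?" \
  --max-tokens 256
```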

## License

All content in this repository is proprietary and confidential. The software and any associated documentation are the exclusive property of Decisions Lab. Unauthorized copying, distribution, modification, or use via any medium is strictly prohibited. Use of this software requires explicit permission from Decisions Lab.

© 2025 Decisions Lab. All rights reserved.