---
language:
- code
license: llama2
tags:
- llama-2
- mlx
pipeline_tag: text-generation
widget:
- text: "<s>Source: system\n\n You are a helpful and honest code assistant <step>\
\ Source: user\n\n Print a hello world in Python <step> Source: assistant\nDestination:\
\ user\n"
inference:
parameters:
max_new_tokens: 200
stop:
- </s>
- <step>
---
# DamienDrash/CodeLlama-70B-Instruct
This model was converted to MLX format from [`codellama/CodeLlama-70b-Instruct-hf`](https://huggingface.co/codellama/CodeLlama-70b-Instruct-hf).
Refer to the [original model card](https://huggingface.co/codellama/CodeLlama-70b-Instruct-hf) for more details on the model.
## Use with mlx
```bash
pip install mlx
git clone https://github.com/ml-explore/mlx-examples.git
cd mlx-examples/llms/hf_llm
python generate.py --model DamienDrash/CodeLlama-70B-Instruct --prompt "My name is"
```
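CodeLlama-70B-Instruct uses the `Source:`/`Destination:` chat template shown in the `widget` section of the frontmatter above, with `<step>` separating turns and `</s>`/`<step>` as stop sequences. The sketch below assembles that prompt in plain Python; the function name `format_prompt` is illustrative and not part of any library.

```python
def format_prompt(system: str, messages: list[tuple[str, str]]) -> str:
    """Build the Source/Destination prompt format for CodeLlama-70B-Instruct.

    `messages` is a list of (role, content) pairs, where role is "user" or
    "assistant". Generation should be stopped at "</s>" or "<step>".
    """
    # System turn comes first, terminated by the <step> separator.
    prompt = f"<s>Source: system\n\n {system.strip()} <step> "
    # Each conversation turn is prefixed with its source role.
    for role, content in messages:
        prompt += f"Source: {role}\n\n {content.strip()} <step> "
    # Trailing header asks the model to produce the assistant's reply
    # addressed to the user.
    prompt += "Source: assistant\nDestination: user\n\n "
    return prompt


prompt = format_prompt(
    "You are a helpful and honest code assistant",
    [("user", "Print a hello world in Python")],
)
print(prompt)
```

The resulting string matches the example prompt in the `widget` field and can be passed directly via `--prompt`.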