---
language:
- code
license: llama2
tags:
- llama-2
- mlx
pipeline_tag: text-generation
widget:
- text: "Source: system\n\n You are a helpful and honest code assistant Source: user\n\n Print a hello world in Python Source: assistant\nDestination: user\n"
inference:
  parameters:
    max_new_tokens: 200
    stop:
    -
    -
---

# DamienDrash/CodeLlama-70B-Instruct

This model was converted to MLX format from [`codellama/CodeLlama-70b-Instruct-hf`](https://huggingface.co/codellama/CodeLlama-70b-Instruct-hf).
Refer to the [original model card](https://huggingface.co/codellama/CodeLlama-70b-Instruct-hf) for more details on the model.

## Use with mlx

```bash
pip install mlx
git clone https://github.com/ml-explore/mlx-examples.git
cd mlx-examples/llms/hf_llm
python generate.py --model DamienDrash/CodeLlama-70B-Instruct --prompt "My name is"
```
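
If you would rather call the model from Python than through the example script, the `mlx-lm` package exposes a `load`/`generate` API. The snippet below is a minimal sketch under the assumptions that `mlx-lm` is installed (`pip install mlx-lm`) and that the converted weights are available under the repository id in the heading above (`DamienDrash/CodeLlama-70B-Instruct`); the prompt string simply mirrors the widget example.

```python
# Minimal sketch: load the MLX-converted weights and run one generation.
# Assumes the mlx-lm package is installed and the repo id below is correct.
from mlx_lm import load, generate

# Downloads the weights from the Hugging Face Hub (or reuses the local cache).
model, tokenizer = load("DamienDrash/CodeLlama-70B-Instruct")

# CodeLlama-70B-Instruct uses the Source/Destination turn format shown in the
# widget text; this prompt is illustrative, not a canonical template.
prompt = (
    "Source: system\n\n You are a helpful and honest code assistant "
    "Source: user\n\n Print a hello world in Python "
    "Source: assistant\nDestination: user\n"
)

response = generate(model, tokenizer, prompt=prompt, max_tokens=200)
print(response)
```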