---
language:
  - en
  - fr
  - de
  - es
  - it
  - pt
  - zh
  - ja
  - ru
  - ko
license: other
tags:
  - mlx
license_name: mrl
license_link: https://mistral.ai/licenses/MRL-0.1.md
extra_gated_description: >-
  If you want to learn more about how we process your personal data, please read
  our <a href="https://mistral.ai/terms/">Privacy Policy</a>.
---

# mlx-community/Mistral-Large-Instruct-2407-8bit

The model `mlx-community/Mistral-Large-Instruct-2407-8bit` was converted to MLX format from `mistralai/Mistral-Large-Instruct-2407` using mlx-lm version 0.16.1.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Mistral-Large-Instruct-2407-8bit")
response = generate(model, tokenizer, prompt="hello", verbose=True)
```