---
license: apache-2.0
license_name: jamba-open-model-license
license_link: https://www.ai21.com/licenses/jamba-open-model-license
pipeline_tag: text-generation
library_name: mlx
tags:
  - mlx
  - safetensors
  - jamba
  - text-generation
  - reasoning
base_model: ai21labs/AI21-Jamba-Reasoning-3B
---

# AI21-Jamba-Reasoning-3B MLX

This repository contains a public MLX safetensors export of ai21labs/AI21-Jamba-Reasoning-3B for Apple Silicon workflows with mlx-lm.

## Model Details

- **Base model:** ai21labs/AI21-Jamba-Reasoning-3B
- **Format:** MLX safetensors
- **Quantization:** none (fp16 weights, per the repository name)
- **Intended use:** local text generation and chat on MLX-compatible Apple Silicon devices

## Quick Start

Install the runtime:

```bash
pip install -U mlx-lm
```

Run a one-shot generation:

```bash
mlx_lm.generate --model ssdataanalysis/AI21-Jamba-Reasoning-3B-mlx-fp16 --prompt "Explain why the sky is blue."
```

Start an interactive chat:

```bash
mlx_lm.chat --model ssdataanalysis/AI21-Jamba-Reasoning-3B-mlx-fp16
```

Run the HTTP server:

```bash
mlx_lm.server --model ssdataanalysis/AI21-Jamba-Reasoning-3B-mlx-fp16 --host 127.0.0.1 --port 8080
```
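Once the server is up, it speaks an OpenAI-compatible API, so any HTTP client can query it. A minimal sketch with `curl` (the prompt, `max_tokens`, and `temperature` values are illustrative):

```shell
curl http://127.0.0.1:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [{"role": "user", "content": "Explain why the sky is blue."}],
    "max_tokens": 256,
    "temperature": 0.7
  }'
```

The response follows the standard chat-completions schema; the generated text is under `choices[0].message.content`.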

You can replace the model ID above with a local path if you have already downloaded the repository.

## Notes

- This is an MLX export intended for use with mlx-lm.
- Licensing follows the upstream model: Apache-2.0, with the Jamba Open Model License details linked above.
- For the original source checkpoint and upstream documentation, see ai21labs/AI21-Jamba-Reasoning-3B.