---
license: apache-2.0
library_name: transformers
tags:
- code
- mlx
datasets:
- codeparrot/github-code-clean
- bigcode/starcoderdata
- open-web-math/open-web-math
- math-ai/StackMathQA
metrics:
- code_eval
pipeline_tag: text-generation
inference: false
model-index:
- name: granite-8b-code-base
  results:
  - task:
      type: text-generation
    dataset:
      name: MBPP
      type: mbpp
    metrics:
    - type: pass@1
      value: 42.2
      name: pass@1
  - task:
      type: text-generation
    dataset:
      name: MBPP+
      type: evalplus/mbppplus
    metrics:
    - type: pass@1
      value: 49.6
      name: pass@1
  - task:
      type: text-generation
    dataset:
      name: HumanEvalSynthesis(Python)
      type: bigcode/humanevalpack
    metrics:
    - type: pass@1
      value: 43.9
      name: pass@1
    - type: pass@1
      value: 52.4
      name: pass@1
    - type: pass@1
      value: 56.1
      name: pass@1
    - type: pass@1
      value: 31.7
      name: pass@1
    - type: pass@1
      value: 43.9
      name: pass@1
    - type: pass@1
      value: 32.9
      name: pass@1
    - type: pass@1
      value: 23.5
      name: pass@1
    - type: pass@1
      value: 32.3
      name: pass@1
    - type: pass@1
      value: 25
      name: pass@1
    - type: pass@1
      value: 23.2
      name: pass@1
    - type: pass@1
      value: 28
      name: pass@1
    - type: pass@1
      value: 19.5
      name: pass@1
    - type: pass@1
      value: 22.6
      name: pass@1
    - type: pass@1
      value: 35.4
      name: pass@1
    - type: pass@1
      value: 38.4
      name: pass@1
    - type: pass@1
      value: 37.2
      name: pass@1
    - type: pass@1
      value: 28.7
      name: pass@1
    - type: pass@1
      value: 15.2
      name: pass@1
base_model:
- ibm-granite/granite-8b-code-base-4k
---
# mlx-community/granite-8b-code-base-8bit

The model [mlx-community/granite-8b-code-base-8bit](https://huggingface.co/mlx-community/granite-8b-code-base-8bit) was converted to MLX format from [ibm-granite/granite-8b-code-base](https://huggingface.co/ibm-granite/granite-8b-code-base) using mlx-lm version **0.12.0**.
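A conversion of this kind can be reproduced with mlx-lm's `mlx_lm.convert` entry point. The command below is an illustrative sketch, not the exact invocation used for this repository; the output path and quantization flags are assumptions (8-bit, matching the repository name):

```shell
# Illustrative sketch: quantize the base model to 8-bit MLX weights.
# -q enables quantization; --q-bits selects the bit width.
pip install "mlx-lm==0.12.0"
python -m mlx_lm.convert \
  --hf-path ibm-granite/granite-8b-code-base \
  --mlx-path granite-8b-code-base-8bit \
  -q --q-bits 8
```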
## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

# Download (on first use) and load the 8-bit quantized weights and tokenizer.
model, tokenizer = load("mlx-community/granite-8b-code-base-8bit")

# This is a base code model rather than an instruct model, so prompt it
# with code to complete instead of a chat message.
response = generate(model, tokenizer, prompt="def fibonacci(n):", verbose=True)
```
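The same generation can also be run from the command line through mlx-lm's CLI; the prompt and token budget below are arbitrary examples, not values from this repository:

```shell
# Illustrative: one-off completion from the command line.
python -m mlx_lm.generate \
  --model mlx-community/granite-8b-code-base-8bit \
  --prompt "def quicksort(arr):" \
  --max-tokens 128
```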