---
language:
- en
license: mit
tags:
- convAI
- conversational
- mlx
license_link: https://huggingface.co/microsoft/phi-2/resolve/main/LICENSE
widget:
- text: Hello who are you?
  example_title: Identity
- text: What can you do?
  example_title: Capabilities
- text: Create a fastapi endpoint to retrieve the weather given a zip code.
  example_title: Coding
pipeline_tag: text-generation
model-index:
- name: phi-2-super
  results:
  - task:
      type: text-generation
      name: Text Generation
    dataset:
      name: Instruction Following Eval
      type: wis-k/instruction-following-eval
    metrics:
    - type: acc
      value: 0.2717
      name: prompt_level_loose_acc
    source:
      url: https://github.com/huggingface/lighteval
      name: LightEval
---
# mlx-community/phi-2-super-4bit
This model was converted to MLX format from [`abacaj/phi-2-super`](https://huggingface.co/abacaj/phi-2-super).
Refer to the [original model card](https://huggingface.co/abacaj/phi-2-super) for more details on the model.
## Use with mlx
```bash
pip install mlx-lm
```
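mlx-lm also ships a small command-line generator, which is a quick way to sanity-check the download before writing any Python. This is a sketch assuming a recent mlx-lm release; flag names may differ slightly between versions:

```bash
# Download the 4-bit weights from the Hub and generate a short completion.
python -m mlx_lm.generate --model mlx-community/phi-2-super-4bit --prompt "hello" --max-tokens 128
```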
```python
from mlx_lm import load, generate

# Fetch the 4-bit weights and tokenizer from the Hub (or reuse the local cache).
model, tokenizer = load("mlx-community/phi-2-super-4bit")
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
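Since phi-2-super is an instruction-tuned chat model, prompts generally work better when wrapped in the tokenizer's chat template. A minimal sketch, assuming the conversion bundles a chat template with the tokenizer (the example prompt reuses the coding widget above):

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/phi-2-super-4bit")

# Format the user turn with the model's chat template before generating.
messages = [
    {"role": "user", "content": "Create a fastapi endpoint to retrieve the weather given a zip code."}
]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```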