---
license: apache-2.0
language:
- en
base_model: Nanbeige/Nanbeige4.1-3B
base_model_relation: quantized
pipeline_tag: text-generation
library_name: mnn
tags:
- code
- mnn
---
This model, [DeProgrammer/Nanbeige4.1-3B-MNN](https://huggingface.co/DeProgrammer/Nanbeige4.1-3B-MNN), was
converted to MNN format from [Nanbeige/Nanbeige4.1-3B](https://huggingface.co/Nanbeige/Nanbeige4.1-3B)
using [llmexport.py](https://github.com/alibaba/MNN/issues/4153#issuecomment-3866182869) in [MNN version **3.4.0**](https://github.com/alibaba/MNN/commit/a874b302f094599e2838a9186e5ce2cf6a81a7a7) with default settings (4-bit quantization).
Inference can be run with MNN, for example via the MNN Chat app on Android.
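The conversion step above can be sketched as a shell session. This is a hedged reconstruction, not the exact command used for this card: the flag names (`--path`, `--export`, `--quant_bit`) follow MNN's llmexport.py usage as documented at the time of writing and may differ between MNN versions, and the local checkpoint directory name is assumed.

```shell
# Get MNN's LLM export tooling (assumed repository layout).
git clone https://github.com/alibaba/MNN.git
cd MNN/transformers/llm/export
pip install -r requirements.txt

# Download the original Hugging Face checkpoint to a local directory
# (./Nanbeige4.1-3B is an assumed path).
# Then convert it to MNN format; --quant_bit 4 matches the default
# 4-bit quantization noted in this card.
python llmexport.py \
  --path ./Nanbeige4.1-3B \
  --export mnn \
  --quant_bit 4
```

The exporter produces the converted weight files alongside a configuration file that MNN's LLM runtime (or the MNN Chat Android app) loads for inference.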