Cognitapp-Med-Nano-v1

Cognitapp-Med-Nano-v1 is a specialized, lightweight medical large language model (LLM) developed by Cognitapp Labs. It is fine-tuned from the Qwen2.5-0.5B architecture to excel at ICD-10-CM Medical Billing and Clinical Extraction.

Key Features

  • Global & Regional Awareness: Optimized for both international and regional clinical standards.
  • Efficiency: 0.5B parameters, designed for 100% offline use on mobile and desktop devices via MLX or llama.cpp.
  • Precision: Trained using prompt-masking to prioritize alphanumeric code accuracy over conversational filler.
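The prompt-masking mentioned above can be sketched as follows. This is a generic illustration of the technique, not Cognitapp's actual training code: prompt tokens are assigned the conventional ignore label (-100) so that the fine-tuning loss is computed only on the answer tokens (the ICD-10 code), never on the conversational prompt. All token IDs and names here are hypothetical.

```python
# Sketch of prompt-masking for causal-LM fine-tuning: only the answer
# tokens contribute to the loss; prompt positions are masked out.

IGNORE_INDEX = -100  # positions with this label are skipped by the loss


def build_labels(prompt_ids, answer_ids):
    """Return (input_ids, labels) with all prompt positions masked."""
    input_ids = list(prompt_ids) + list(answer_ids)
    labels = [IGNORE_INDEX] * len(prompt_ids) + list(answer_ids)
    return input_ids, labels


# Example: 5 hypothetical prompt tokens, 3 answer tokens (e.g. a code like "A90")
inp, lab = build_labels([101, 2023, 2003, 1996, 102], [318, 319, 320])
assert lab[:5] == [IGNORE_INDEX] * 5  # prompt contributes no loss
assert lab[5:] == [318, 319, 320]     # loss focuses on the code tokens
```

Masking the prompt this way is what steers a small model toward emitting the bare alphanumeric code rather than conversational filler.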

How to use with MLX

from mlx_lm import load, generate

# Download the model and tokenizer from the Hugging Face Hub
model, tokenizer = load("Cognitapp/Cognitapp-Med-Nano-v1")

# The model expects <system>/<user>/<assistant> tags in the prompt
prompt = "<system>You are the Cognitapp Global ICD-10 Assistant. Extract the primary ICD-10 code.</system> <user>Patient has 103F fever, body aches, and positive NS1 for Dengue.</user> <assistant>"

# A small max_tokens budget suffices: the expected output is a short code
response = generate(model, tokenizer, prompt=prompt, max_tokens=10)
print(response)
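Since the model's output should be a bare ICD-10-CM code, a lightweight post-processing check can catch malformed generations before they reach a billing workflow. The sketch below is an assumption on our part (not part of the model's tooling), and the regex approximates the ICD-10-CM code shape only; it does not confirm that a code exists in the current code set.

```python
import re

# Simplified ICD-10-CM shape: a letter (U excluded), a digit, an alphanumeric,
# then an optional dot followed by 1-4 alphanumerics (e.g. "A90", "J10.1").
# This validates format only, not membership in the actual code set.
ICD10CM_PATTERN = re.compile(r"[A-TV-Z][0-9][0-9A-Z](?:\.[0-9A-Z]{1,4})?")


def extract_icd10(text: str):
    """Return the first ICD-10-CM-shaped token in the model output, or None."""
    match = ICD10CM_PATTERN.search(text.upper())
    return match.group(0) if match else None


print(extract_icd10("Primary code: A90 (Dengue fever)"))  # → A90
print(extract_icd10("I could not determine a code."))     # → None
```

A check like this pairs naturally with the card's guidance that every output must still be verified by a licensed professional.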

Intended Use

This model is a supportive tool for medical professionals and billers. It is NOT a diagnostic tool.

Training Data

Fine-tuned on a balanced dataset of 1,200+ global and regional clinical scenarios including pediatrics, geriatrics, and infectious diseases.

Disclaimer

All outputs must be verified by a licensed healthcare professional. Cognitapp Labs is not responsible for any clinical or billing errors.

Model Details

  • Format: Safetensors
  • Model size: 0.5B params
  • Tensor type: BF16
  • Runtime: MLX

