---
language:
- fr
- en
license: apache-2.0
tags:
- medical
- continual-pretraining
- instruction-tuning
- cpt+sft
- causal-lm
- question-answering
base_model: BioMistral-7B
model_type: causal-lm
---
# BioMistral-7B-CPT-SFT (CPT+SFT)
## Model description
This checkpoint applies **two-stage domain adaptation**:
1) **CPT** (continual pretraining) on unlabeled medical text from the DrBERT/NACHOS corpus, to adapt the model's representations to the medical domain, then
2) **SFT** (supervised fine-tuning) on labeled medical QA (MedInjection-FR/ALL), to optimize downstream task performance.
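## Prompt format

Since the base model is Mistral-derived, the SFT stage presumably uses the Mistral `[INST] ... [/INST]` instruction template. The sketch below shows how a French medical question could be wrapped in that template; the exact template and the `build_prompt` helper are assumptions for illustration, not confirmed details of this checkpoint.

```python
def build_prompt(question: str) -> str:
    """Wrap a question in the Mistral instruction template.

    Note: the template is an assumption based on the Mistral base model;
    check the tokenizer's chat template for the authoritative format.
    """
    return f"<s>[INST] {question.strip()} [/INST]"

prompt = build_prompt("Quels sont les symptômes de l'hypertension ?")
print(prompt)
```

With `transformers`, the same formatting can usually be obtained via `tokenizer.apply_chat_template`, which reads the template stored with the checkpoint.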