---
language:
- fr
- en
license: apache-2.0
tags:
- medical
- domain-adaptation
- continual-pretraining
- causal-lm
- question-answering
- evaluation
datasets:
- Dr-BERT/NACHOS
base_model: BioMistral-7B
model_type: causal-lm
---
# BioMistral-7B-CPT (CPT)
## Model description
This checkpoint is a **continual-pretrained (CPT)** version of **BioMistral-7B**, adapted on unlabeled French medical text to strengthen domain-specific representations for medical question answering. CPT is performed via **full-parameter training** (all weights updated, no adapters) on the **Dr-BERT/NACHOS** medical corpus.
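As a minimal sketch of the data side of such a CPT run: unlabeled documents are typically tokenized, concatenated, and packed into fixed-length blocks so every causal-LM training sample fills a full context window. The function and parameter names below are illustrative, not the actual training code for this checkpoint.

```python
# Illustrative data-packing step for causal-LM continual pretraining:
# tokenized documents are concatenated and cut into equal-length blocks.
# Names (pack_into_blocks, block_size) are illustrative, not from this repo.

def pack_into_blocks(token_ids_per_doc, block_size):
    """Concatenate tokenized documents and split into fixed-size blocks,
    dropping the trailing remainder (common CPT practice)."""
    flat = [tok for doc in token_ids_per_doc for tok in doc]
    total = (len(flat) // block_size) * block_size
    return [flat[i:i + block_size] for i in range(0, total, block_size)]

# Toy token ids standing in for tokenized NACHOS text.
docs = [[1, 2, 3, 4, 5], [6, 7, 8], [9, 10, 11, 12]]
blocks = pack_into_blocks(docs, block_size=4)
# Each block serves as both input_ids and labels; the trainer shifts
# labels by one position internally for next-token prediction.
```

In full-parameter CPT the packed blocks are then fed to a standard causal-LM training loop with the next-token cross-entropy loss over all model weights.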