CardioLlama.nl_clinical is Llama-3.2-1B-Instruct with domain-adaptive pretraining (DAPT), also called continued pre-training (CPT), on a Dutch medical corpus that is slightly biased towards cardiology.

The model was first trained for one full epoch on the Dutch medical corpus with a batch size of 256, a maximum sequence length of 768, and a linear-cosine schedule. It was then further pre-trained for about one epoch on 5 million cardiology records from the UMCU, mixed with a random selection of the Dutch medical corpus to avoid model collapse, using a maximum sequence length of 1024 and a linear-decay schedule with warmup.
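The linear-decay-with-warmup schedule used in the second stage can be sketched as a simple function of the step count. This is an illustrative sketch only: the peak learning rate, total steps, and warmup steps below are hypothetical placeholders, not the values used to train this model.

```python
# Hypothetical sketch of a linear warmup + linear decay LR schedule.
# peak_lr, total_steps and warmup_steps are illustrative, NOT the
# actual CardioLlama.nl_clinical training values.

def lr_at_step(step: int, total_steps: int, warmup_steps: int, peak_lr: float) -> float:
    """Linear warmup from 0 to peak_lr, then linear decay back to 0."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    remaining = total_steps - step
    return peak_lr * max(0.0, remaining / (total_steps - warmup_steps))

# Example: 10,000 total steps, 500 warmup steps, peak LR 2e-5.
schedule = [lr_at_step(s, 10_000, 500, 2e-5) for s in range(10_001)]
print(max(schedule))  # peak is reached exactly at the end of warmup
```

In Hugging Face `transformers`, the equivalent behaviour comes from setting `lr_scheduler_type="linear"` together with `warmup_steps` in `TrainingArguments`.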

Perplexity on the validation set was around 4.
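For context, perplexity is the exponential of the mean per-token cross-entropy loss, so a validation perplexity of about 4 corresponds to a loss of ln(4) ≈ 1.386 nats per token. A minimal sketch of the conversion:

```python
import math

def perplexity(mean_nll: float) -> float:
    """Perplexity is exp of the mean per-token negative log-likelihood (in nats)."""
    return math.exp(mean_nll)

# A mean loss of ln(4) nats/token corresponds to perplexity 4.
loss = math.log(4)
print(perplexity(loss))  # → 4.0 (up to floating-point rounding)
```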
