---
license: llama3.2
datasets:
- UMCU/DutchMedicalText
language:
- nl
base_model:
- meta-llama/Llama-3.2-1B-Instruct
tags:
- medical
- cardiology
---

Llama-3.2-1B-Instruct with domain-adaptive pretraining (DAPT), also called continued pretraining (CPT), on a Dutch medical corpus that is slightly biased towards cardiology.
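
A minimal usage sketch with the Hugging Face `transformers` library; the model ID, prompt, and generation settings below are illustrative assumptions, not settings from this card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical model ID; replace with this repository's actual ID.
model_id = "UMCU/Llama-3.2-1B-Instruct-DutchMedical"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Dutch medical prompt: "The patient presented with chest pain and"
prompt = "De patiënt presenteerde zich met pijn op de borst en"
inputs = tokenizer(prompt, return_tensors="pt")

# The model does not emit an EOS token (see the note below),
# so generation must be bounded with max_new_tokens.
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```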

The model was trained for one full epoch with a batch size of 256, a maximum sequence length of 768 tokens, and a linear-cosine learning-rate schedule (details to follow).
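
A hypothetical reconstruction of that setup with `transformers.TrainingArguments`; only the epoch count, the effective batch size of 256, and the schedule family come from this card, while the learning rate, warmup, precision, and batch-size split are assumptions:

```python
from transformers import TrainingArguments

# Sketch only: values marked "assumed" are not the authors' actual config.
training_args = TrainingArguments(
    output_dir="llama3.2-1b-dutch-medical-dapt",
    num_train_epochs=1,              # one full epoch (from the card)
    per_device_train_batch_size=16,  # assumed split; effective batch size
    gradient_accumulation_steps=16,  # 16 * 16 = 256 (from the card)
    lr_scheduler_type="cosine",      # linear warmup + cosine decay, one
    warmup_ratio=0.03,               # reading of "linear-cosine" (assumed)
    learning_rate=2e-5,              # assumed
    bf16=True,                       # assumed precision
)
# The 768-token maximum sequence length is applied at tokenization time,
# e.g. tokenizer(..., truncation=True, max_length=768).
```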

This model will be further pretrained on 5 million cardiology records from the UMCU (University Medical Center Utrecht).

Perplexity on the validation set was approximately 5.
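
For context, perplexity is the exponential of the mean per-token cross-entropy loss, so a perplexity of about 5 corresponds to a validation loss of roughly 1.6:

```python
import math

# Perplexity = exp(mean cross-entropy loss per token).
eval_loss = 1.61                 # hypothetical validation loss
perplexity = math.exp(eval_loss)
print(round(perplexity, 2))      # ≈ 5.0
```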

Note: this model is not instruction-tuned and does not generate an EOS token; an update is coming.
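
Until that update lands, generation can be cut off with `max_new_tokens` (as in the usage sketch above) or, as a stopgap, with a custom stopping criterion; the stop string below is an arbitrary illustrative choice, not something the model was trained to emit:

```python
from transformers import StoppingCriteria, StoppingCriteriaList

class StopOnSubstring(StoppingCriteria):
    """Stop generation once a given substring appears in the generated text."""

    def __init__(self, tokenizer, stop_string, prompt_len):
        self.tokenizer = tokenizer
        self.stop_string = stop_string
        self.prompt_len = prompt_len  # number of prompt tokens to skip

    def __call__(self, input_ids, scores, **kwargs):
        generated = self.tokenizer.decode(input_ids[0, self.prompt_len:])
        return self.stop_string in generated

# tokenizer, model, and inputs as in the usage sketch above.
stop = StoppingCriteriaList(
    [StopOnSubstring(tokenizer, "\n\n", inputs["input_ids"].shape[1])]
)
outputs = model.generate(**inputs, max_new_tokens=128, stopping_criteria=stop)
```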