UMCU committed on
Commit a389e24 · verified · 1 Parent(s): c55e87a

Update README.md

Files changed (1)
  1. README.md +3 -1
README.md CHANGED
@@ -15,4 +15,6 @@ Llama-3.2-1B-Instruct, with domain adapted pretraining (DAPT), also called Conti
 
 Training for one full epoch, with a 256 batch size, maximally 768 sequence length and a linear-cosine schedule (details follow..).
 
-This model will be further pre-trained on 5 million cardiology records from the UMCU.
+This model will be further pre-trained on 5 million cardiology records from the UMCU.
+
+The perplexity was around 5 on the validation set.