UMCU committed
Commit c55e87a · verified · 1 parent: f29ace4

Update README.md

Files changed (1): README.md +7 -1
README.md CHANGED
@@ -9,4 +9,10 @@ base_model:
  tags:
  - medical
  - cardiology
- ---
+ ---
+
+ Llama-3.2-1B-Instruct, with domain-adapted pretraining (DAPT), also called continuous pre-training (CPT), on a Dutch medical corpus.
+
+ Training ran for one full epoch with a batch size of 256, a maximum sequence length of 768, and a linear-cosine schedule (details to follow).
+
+ This model will be further pre-trained on 5 million cardiology records from the UMCU.
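The "linear-cosine schedule" mentioned in the diff is not specified further ("details to follow"). A common reading is linear warmup followed by cosine decay; the sketch below shows that interpretation. The function name and the `peak_lr`/`warmup_steps` parameters are illustrative assumptions, not values from the commit.

```python
import math

def linear_cosine_lr(step: int, total_steps: int, peak_lr: float, warmup_steps: int) -> float:
    """One possible 'linear-cosine' schedule: linear warmup to peak_lr,
    then cosine decay toward zero. All parameters are hypothetical."""
    if step < warmup_steps:
        # Linear warmup: ramp from peak_lr/warmup_steps up to peak_lr.
        return peak_lr * (step + 1) / warmup_steps
    # Cosine decay over the remaining steps.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return 0.5 * peak_lr * (1.0 + math.cos(math.pi * progress))
```

At the end of warmup the schedule returns exactly `peak_lr`, and at `total_steps` it has decayed to zero; libraries such as Hugging Face `transformers` ship a similar built-in (`get_cosine_schedule_with_warmup`).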