2026-04-15 16:22:36,752 - INFO - train_pipeline - Logging to ./output_checkpoints/graphcodebert-focal/training.log
2026-04-15 16:22:36,755 - INFO - train_pipeline - Loading model & tokenizer for 'microsoft/graphcodebert-base'
2026-04-15 16:22:39,187 - INFO - train_pipeline - Model placed on cuda
2026-04-15 16:22:39,191 - INFO - train_pipeline - Base model weights frozen – only classifier head will be trained.
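The "base model weights frozen" step presumably sets `requires_grad = False` on everything under the encoder so the optimizer only touches the classifier head. A minimal sketch of that pattern with a tiny stand-in module (the names `encoder`/`head` are illustrative, not the pipeline's actual attribute names):

```python
import torch
from torch import nn

class TinyClassifier(nn.Module):
    """Stand-in for the real model: a frozen 'base' plus a trainable head."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(768, 768)  # stands in for the base model
        self.head = nn.Linear(768, 2)       # stands in for the classifier head

model = TinyClassifier()

# Freeze everything under the base module; the head stays trainable.
for param in model.encoder.parameters():
    param.requires_grad = False

trainable = [name for name, p in model.named_parameters() if p.requires_grad]
print(trainable)  # only the head's weight and bias remain
```

For the real run, the same loop over `model.roberta.parameters()` would leave only `model.classifier` trainable, which matches the parameter summary later in the log.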
2026-04-15 16:22:39,192 - INFO - train_pipeline - ===== Model Architecture =====
2026-04-15 16:22:39,196 - INFO - train_pipeline -
RobertaForSequenceClassification(
  (roberta): RobertaModel(
    (embeddings): RobertaEmbeddings(
      (word_embeddings): Embedding(50265, 768, padding_idx=1)
      (position_embeddings): Embedding(514, 768, padding_idx=1)
      (token_type_embeddings): Embedding(1, 768)
      (LayerNorm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
      (dropout): Dropout(p=0.2, inplace=False)
    )
    (encoder): RobertaEncoder(
      (layer): ModuleList(
        (0-11): 12 x RobertaLayer(
          (attention): RobertaAttention(
            (self): RobertaSdpaSelfAttention(
              (query): Linear(in_features=768, out_features=768, bias=True)
              (key): Linear(in_features=768, out_features=768, bias=True)
              (value): Linear(in_features=768, out_features=768, bias=True)
              (dropout): Dropout(p=0.2, inplace=False)
            )
            (output): RobertaSelfOutput(
              (dense): Linear(in_features=768, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
              (dropout): Dropout(p=0.2, inplace=False)
            )
          )
          (intermediate): RobertaIntermediate(
            (dense): Linear(in_features=768, out_features=3072, bias=True)
            (intermediate_act_fn): GELUActivation()
          )
          (output): RobertaOutput(
            (dense): Linear(in_features=3072, out_features=768, bias=True)
            (LayerNorm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (dropout): Dropout(p=0.2, inplace=False)
          )
        )
      )
    )
  )
  (classifier): RobertaClassificationHead(
    (dense): Linear(in_features=768, out_features=768, bias=True)
    (dropout): Dropout(p=0.2, inplace=False)
    (out_proj): Linear(in_features=768, out_features=2, bias=True)
  )
)
2026-04-15 16:22:39,199 - INFO - train_pipeline - ===== Parameter Summary =====
2026-04-15 16:22:39,200 - INFO - train_pipeline - Total Parameters: 124,647,170
2026-04-15 16:22:39,201 - INFO - train_pipeline - Trainable Parameters: 592,130
2026-04-15 16:22:39,204 - INFO - train_pipeline - Non-trainable Parameters: 124,055,040
2026-04-15 16:22:39,205 - INFO - train_pipeline - ===== Tokenizer Summary =====
2026-04-15 16:22:39,227 - INFO - train_pipeline - Vocab size: 50265 | Special tokens: ['<s>', '</s>', '<unk>', '<pad>', '<mask>']
2026-04-15 16:22:39,230 - INFO - train_pipeline - ===== End of Architecture Log =====
2026-04-15 16:23:26,328 - INFO - train_pipeline - === Starting training ===
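The checkpoint directory name (`graphcodebert-focal`) suggests the run trains with focal loss; the log does not record its hyperparameters, so the `alpha`/`gamma` defaults below are illustrative only. For one example with probability `p` assigned to the true class, focal loss scales cross-entropy by `(1 - p)**gamma`, down-weighting already well-classified examples:

```python
import math

def focal_loss(p, alpha=0.25, gamma=2.0):
    """Focal loss for a single example, given the model's probability p
    for the true class. With alpha=1 and gamma=0 this reduces to plain
    cross-entropy, -log(p)."""
    return -alpha * (1.0 - p) ** gamma * math.log(p)

# Easy (confident, correct) examples contribute far less than hard ones:
easy = focal_loss(0.95)
hard = focal_loss(0.30)
print(easy < hard)  # True
```

In a class-imbalanced binary task this keeps the abundant easy negatives from dominating the gradient of the 2-way classifier head.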