dtype: torch.bfloat16
implementation: pythia
model_name: EleutherAI/pythia-14m
n_ctx: '2048'
n_heads: '4'
n_layers: '6'