Fix: vocab_size=32000 (BPE base from model.vocab); top-level weights; all compat fixes (ab43fee, verified; suchirsalhan committed 29 days ago)
Fix: vocab_size=32768 from tokenizer; top-level weights; ZeroDivisionError guard; all_tied_weights_keys->dict (2f32d98, verified; suchirsalhan committed 30 days ago)
Fix pico_decoder.py: __init__ defaults, ZeroDivisionError, all_tied_weights_keys (9b894b8, verified; suchirsalhan committed about 1 month ago)
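The two commits above mention a ZeroDivisionError guard and the change of all_tied_weights_keys from a list to a dict. A minimal sketch of what such fixes typically look like, assuming the division arises when averaging loss over non-padding tokens (the function name, the `num_valid_tokens` parameter, and the tied-weight key names are illustrative assumptions, not taken from pico_decoder.py):

```python
def safe_mean_loss(total_loss: float, num_valid_tokens: int) -> float:
    """Average loss per token, guarding against an empty denominator.

    An all-padding batch yields num_valid_tokens == 0; returning 0.0
    avoids the ZeroDivisionError the commits describe. (Hypothetical
    sketch; the real guard in pico_decoder.py may differ.)
    """
    if num_valid_tokens == 0:
        return 0.0
    return total_loss / num_valid_tokens


class PicoDecoderSketch:
    # A dict (rather than a list) lets each tied parameter name map to the
    # parameter it shares storage with; key names here are assumptions.
    all_tied_weights_keys = {"lm_head.weight": "embed_tokens.weight"}
```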
Add pico_decoder.py + auto_map config (main) (f0f046e, verified; suchirsalhan committed about 1 month ago)
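The auto_map entry in config.json is the Hugging Face mechanism that lets the Auto* classes load custom modeling code shipped in the repo. A sketch of what the commit above likely added, assuming class names like PicoDecoderConfig and PicoDecoderForCausalLM (the actual names in pico_decoder.py may differ):

```json
{
  "auto_map": {
    "AutoConfig": "pico_decoder.PicoDecoderConfig",
    "AutoModelForCausalLM": "pico_decoder.PicoDecoderForCausalLM"
  }
}
```

With this in place, users load the model via `AutoModelForCausalLM.from_pretrained(..., trust_remote_code=True)`, which executes the repo's pico_decoder.py.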
Fix config.json: inferred from step-18341 (vocab=32000, hidden=768, layers=14) (a035e43, verified; suchirsalhan committed about 1 month ago)
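The values inferred from the step-18341 checkpoint would correspond to a config.json fragment roughly like the following, assuming the conventional Hugging Face field names (the custom config class may use different keys; note a later commit revises vocab_size to 32768 before the most recent one returns it to 32000):

```json
{
  "vocab_size": 32000,
  "hidden_size": 768,
  "num_hidden_layers": 14
}
```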