mRNABERT-patched / bert_layers.py

Commit History

add attention return + support eager attention or triton FA2 via config.use_flash_attn
e1354bd
verified

Taykhoom committed on
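The commit above describes returning attention weights and choosing between eager attention and a triton FlashAttention-2 kernel via `config.use_flash_attn`. A minimal sketch of that dispatch pattern, with hypothetical names (`BertConfig`, `attention_forward`) and the eager path written in NumPy; the triton FA2 kernel itself is not reproduced here:

```python
import numpy as np
from dataclasses import dataclass


@dataclass
class BertConfig:
    # hypothetical config object mirroring the commit's config.use_flash_attn flag
    use_flash_attn: bool = False


def eager_attention(q, k, v, return_attn=False):
    """Plain softmax attention; can optionally return the attention weights."""
    scale = 1.0 / np.sqrt(q.shape[-1])
    scores = q @ k.transpose(0, 2, 1) * scale        # (batch, seq, seq)
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    probs = np.exp(scores)
    probs /= probs.sum(axis=-1, keepdims=True)
    out = probs @ v
    return (out, probs) if return_attn else (out, None)


def attention_forward(config, q, k, v, return_attn=False):
    """Dispatch on config.use_flash_attn, in the spirit of the commit message.

    A fused FA2 kernel never materializes the attention matrix, so a request
    for attention weights must take the eager path; here the FA2 branch is a
    placeholder that also falls back to the eager computation.
    """
    if config.use_flash_attn and not return_attn:
        # placeholder for a triton FlashAttention-2 kernel call
        return eager_attention(q, k, v, return_attn=False)
    return eager_attention(q, k, v, return_attn=return_attn)


rng = np.random.default_rng(0)
q = k = v = rng.standard_normal((1, 4, 8))
out, attn = attention_forward(BertConfig(use_flash_attn=False), q, k, v, return_attn=True)
print(out.shape, attn.shape)  # (1, 4, 8) (1, 4, 4)
```

Returning `None` for the weights on the fused path matches the usual trade-off: FA2 gains memory and speed precisely by never building the full `(seq, seq)` probability matrix.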

Duplicate from YYLY66/mRNABERT
62da139

Taykhoom YYLY66 committed on