DNABERT2-patched / bert_layers.py

Commit History

add attention return + support eager attention or triton FA2 via config.use_flash_attn
f2409f7
verified

Taykhoom committed on

Duplicate from zhihan1996/DNABERT-2-117M
ea63a23

Taykhoom and zhihan1996 committed on
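The first commit above gates the attention backend on `config.use_flash_attn`, choosing between eager attention and Triton FlashAttention-2. A minimal sketch of that kind of config-driven dispatch is below; the config class, module name `flash_attn_triton`, and helper `select_attention_impl` are assumptions for illustration, not the actual code in `bert_layers.py`:

```python
from dataclasses import dataclass


@dataclass
class BertConfig:
    # Hypothetical mirror of the flag named in the commit message.
    use_flash_attn: bool = False


def select_attention_impl(config: BertConfig) -> str:
    """Pick an attention backend from the config flag.

    Tries the Triton FlashAttention-2 kernel only when the flag is set,
    and falls back to eager attention if the kernel is unavailable.
    """
    if config.use_flash_attn:
        try:
            import flash_attn_triton  # hypothetical Triton FA2 module  # noqa: F401
            return "triton_fa2"
        except ImportError:
            # Graceful fallback keeps the model usable without the kernel.
            pass
    return "eager"
```

Falling back to eager attention on `ImportError` means the same checkpoint loads on machines without the Triton kernel installed, which is the usual motivation for exposing such a flag in the config rather than hard-coding the backend.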