use_cache set to false?
#4 opened by pszemraj
Hi, I noticed that the config sets `"use_cache": false` — is there a reason this is better/needed for this model? Disabling the KV cache makes inference with transformers abysmally slow, and as far as I know it goes against the standard configuration for text-only decoder models.
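For anyone hitting the slowdown in the meantime, the shipped config value can be overridden per generation call via a `GenerationConfig` — a sketch, assuming a standard transformers setup (the `model.generate` line in the comment is illustrative, not tied to this particular model):

```python
from transformers import GenerationConfig

# Re-enable the KV cache for generation, overriding the
# "use_cache": false value shipped in the model config.
gen_config = GenerationConfig(use_cache=True, max_new_tokens=64)
print(gen_config.use_cache)  # True

# Then pass it when generating, e.g.:
# model.generate(**inputs, generation_config=gen_config)
```

Alternatively, setting `model.config.use_cache = True` after loading has the same effect for subsequent calls.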