---
license: apache-2.0
library_name: transformers
pipeline_tag: text-generation
---

# QRWKV7-7B-Instruct

This repository contains the model described in the paper *RADLADS: Rapid Attention Distillation to Linear Attention Decoders at Scale*.

GitHub: https://github.com/recursal/RADLADS
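Since the metadata declares `library_name: transformers` and `pipeline_tag: text-generation`, the model can presumably be loaded with the standard `transformers` pipeline API. The snippet below is a minimal sketch; the repo id `recursal/QRWKV7-7B-Instruct` is an assumption based on this card's title and organization, and `trust_remote_code=True` is included on the assumption that the architecture ships custom modeling code, as many RWKV-derived checkpoints do.

```python
# Minimal usage sketch (assumed repo id: recursal/QRWKV7-7B-Instruct).
from transformers import pipeline

# trust_remote_code=True is assumed necessary for the custom architecture.
pipe = pipeline(
    "text-generation",
    model="recursal/QRWKV7-7B-Instruct",
    trust_remote_code=True,
)

output = pipe("What is linear attention?", max_new_tokens=64)
print(output[0]["generated_text"])
```

For an instruct-tuned model, wrapping the prompt with the tokenizer's chat template (via `pipe.tokenizer.apply_chat_template`) will generally give better results than a raw string prompt.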