This model is a finetune of the RWKV-4 World 7B English model on the Wizard dataset, trained to fit a 32k context window.

It should handle fairly complex instructions and can take up to 32k tokens of input, but note that the Wizard dataset only has samples of at most about 4K tokens each, so I'll run more tests on long prompts and long generation.

Use https://github.com/josStorer/RWKV-Runner to run this model.
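If you drive the model from your own code instead of RWKV-Runner, you will need to format instructions into a chat-style prompt and keep the total input under the 32k context window. The sketch below is only an illustration of that bookkeeping: the `User:` / `Assistant:` template and the rough tokens-per-character ratio are assumptions, not guaranteed by this model card, so check them against RWKV-Runner's default template before relying on them.

```python
CTX_LEN = 32768  # the 32k context window this finetune targets

def build_prompt(instruction: str) -> str:
    """Format a single-turn instruction in the chat style commonly used
    with RWKV World models (an assumption -- verify against the template
    RWKV-Runner uses for this model)."""
    return f"User: {instruction.strip()}\n\nAssistant:"

def fits_context(prompt: str, chars_per_token: float = 4.0) -> bool:
    """Crude length check: estimate the token count from character count
    (chars_per_token is a rough heuristic, not a real tokenizer)."""
    return len(prompt) / chars_per_token <= CTX_LEN
```

A real pipeline would replace `fits_context` with the model's actual tokenizer, but the shape of the check stays the same.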

