---
license: apache-2.0
base_model:
  - teknium/OpenHermes-13B
library_name: transformers
tags:
  - RoPE-scaling
---

# OpenHermes 13B 32K

It's a bit language drunk at the moment; I'm going to sober it up soon. As a dyslexic, I know kin when I see it 🤡 It certainly won't help you with your spelling! I'll be fine-tuning it back up again soon, so its ickle AI flamingo legs won't bow and wobble like it has to pee anymore.

Or, in nerdo corporate speak: the context window has been extended from 4K to 32K via linear RoPE scaling. A post-RoPE stabilization finetune is required for coherence at long contexts.
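For the curious, here's a minimal sketch of what linear RoPE scaling does to the rotary position angles. The numbers here are assumptions, not pulled from this model's config: a Llama-style 4096-token base context scaled by a factor of 8 to reach 32768, with the usual rotary base of 10000.

```python
import numpy as np

def rope_angles(positions, dim=128, base=10000.0, scale=1.0):
    # Standard rotary embedding angles: outer product of positions
    # and inverse frequencies. Linear RoPE scaling simply divides
    # the positions by `scale`, squeezing a long sequence back into
    # the position range the model saw during pretraining.
    inv_freq = 1.0 / (base ** (np.arange(0, dim, 2) / dim))
    return np.outer(np.asarray(positions, dtype=np.float64) / scale, inv_freq)

# Assumed setup: 4096-token base context, scale = 32768 / 4096 = 8.
# Position 32760 in the scaled model gets the same angles that
# position 4095 would get in the unscaled one.
scaled = rope_angles([32760], scale=8.0)
unscaled = rope_angles([4095], scale=1.0)
print(np.allclose(scaled, unscaled))
```

This interpolation is also why a stabilization finetune is needed: the model now sees fractional positions it never encountered in pretraining, so quality wobbles until it's tuned on long sequences.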