Datacenter Models
Collection
Lmao • 2 items • Updated
A finetune of GLM-4.5 Air to improve prose and writing quality and to remove the bulk of GLM-isms, using a Gutenberg-like methodology.
No particular attempt was made to preserve thinking ability; I recommend skipping thinking as in the GLM template, i.e. using </think> as a prefill.
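A minimal sketch of the prefill idea. The template tokens below are illustrative placeholders, not the exact GLM-4.5 chat template; check the tokenizer's chat template for the real special tokens.

```python
def build_prompt(user_message: str, skip_thinking: bool = True) -> str:
    # Illustrative GLM-style chat layout (token names are assumptions,
    # not the verbatim GLM-4.5 template).
    prompt = f"<|user|>\n{user_message}\n<|assistant|>\n"
    if skip_thinking:
        # Prefilling a closing </think> tag makes the model treat the
        # thinking block as already finished, so it answers directly.
        prompt += "</think>"
    return prompt

print(build_prompt("Write a short story about rain."))
```

With an OpenAI-compatible server, the same effect is typically achieved by supplying an assistant turn that starts with </think> and asking the server to continue it.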
Trained for 8 epochs on a variety of backtranslated, critically acclaimed short-story anthologies.