Writer's Tower 9B

This is a creative, uncensored merge of pre-trained language models, created with mergekit using a custom method known as BCR. The merged model was ablated post-merge with biprojected norm-preservation, and quantizations were made directly from the FP32 safetensors.

Component models:

  • DavidAU/Gemma-The-Writer-9B
  • Unbabel/Tower-Plus-9B

Process:

  • Stage 1: lm_head dedupe
  • Passthrough upscales to 14B & config.json patch
  • Stage 2: BCR Merge
  • Stage 3: Biprojected Norm-Preserved Ablation
  • Stage 4: Quantization
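
The card does not spell out how Stage 3's "norm-preserved ablation" works, so the following is a minimal sketch of one common interpretation: project an unwanted direction out of each weight row, then rescale the rows back to their original L2 norms so the layer's overall magnitude is unchanged. The function name, the single-sided projection, and the toy matrices are all illustrative assumptions, not the actual BCR implementation (the "biprojected" variant presumably projects on both the input and output sides).

```python
import numpy as np

def norm_preserved_ablation(W, v):
    """Hypothetical sketch: remove the component of each row of W
    along direction v, then restore each row's original L2 norm."""
    v = v / np.linalg.norm(v)                      # unit direction to ablate
    orig_norms = np.linalg.norm(W, axis=1, keepdims=True)
    W_abl = W - np.outer(W @ v, v)                 # project v out of every row
    new_norms = np.linalg.norm(W_abl, axis=1, keepdims=True)
    # Rescale rows so their norms match the original weights exactly.
    return W_abl * (orig_norms / np.maximum(new_norms, 1e-12))

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))     # toy weight matrix
v = rng.normal(size=8)          # toy direction to ablate
W2 = norm_preserved_ablation(W, v)
# Row norms are preserved, yet W2 has no component along v.
print(np.allclose(np.linalg.norm(W2, axis=1), np.linalg.norm(W, axis=1)))
```

Rescaling each row by a positive scalar cannot reintroduce the ablated direction, so the two properties (zero projection onto v, unchanged row norms) hold simultaneously.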

Model tree for Naphula-Archives/Writers-Tower-9B