---
license: apache-2.0
base_model:
  - arcee-ai/Arcee-Blitz
  - ArliAI/Mistral-Small-24B-ArliAI-RPMax-v1.4
  - dphn/Dolphin-Mistral-24B-Venice-Edition
  - Mawdistical/Mawdistic-NightLife-24b
  - mistralai/Mistral-Small-24B-Instruct-2501
  - Nohobby/MS3-Tantum-24B-v0.1
  - PocketDoc/Dans-DangerousWinds-V1.1.1-24b
  - PocketDoc/Dans-PersonalityEngine-V1.2.0-24b
  - ReadyArt/Broken-Tutu-24B-Transgression-v2.0
  - SicariusSicariiStuff/Redemption_Wind_24B
  - spacewars123/Space-Wars-24B-v1.00a
  - TheDrummer/Cydonia-24B-v2
  - trashpanda-org/MS-24B-Instruct-Mullein-v0
  - TroyDoesAI/BlackSheep-24B
  - Undi95/MistralThinker-v1.1
language:
  - en
library_name: transformers
tags:
  - creative
  - creative writing
  - fiction writing
  - plot generation
  - sub-plot generation
  - story generation
  - scene continue
  - storytelling
  - fiction story
  - science fiction
  - romance
  - all genres
  - story
  - writing
  - vivid prosing
  - vivid writing
  - fiction
  - roleplaying
  - float32
  - swearing
  - mistral
  - rp
  - horror
  - gemma
  - merge
  - mergekit
widget:
  - text: Boreas-24B-v1.2
    output:
      url: >-
        https://cdn-uploads.huggingface.co/production/uploads/68e840caa318194c44ec2a04/fcjkZcNgGo77Pd8wHKuvt.png
new_version: Naphula/Boreas-24B-v1.3
---

# Boreas 1.2 - Radioactive Edition

Uses the same components as v1.1, but with the FLUX_v5 merge method from v1.0.

A 20-hour FLUX merge, using 1000 iterations to find the optimal center.

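The FLUX_v5 procedure itself is not documented in this card, so the following is only a minimal sketch of what "finding the center" of several checkpoints could mean, assuming it refers to the element-wise mean of matching parameter tensors (the function name and the toy data are illustrative, not part of the actual method):

```python
def center_of(state_dicts):
    """Average matching parameters across a list of state dicts.

    Each state dict maps parameter names to flat lists of floats;
    all dicts are assumed to share the same keys and lengths.
    """
    n = len(state_dicts)
    return {
        k: [sum(sd[k][i] for sd in state_dicts) / n
            for i in range(len(state_dicts[0][k]))]
        for k in state_dicts[0]
    }

# Toy example with two tiny "models"
a = {"w": [1.0, 2.0]}
b = {"w": [3.0, 4.0]}
print(center_of([a, b]))  # {'w': [2.0, 3.0]}
```

An iterative method like the one described above would presumably refine such a center over many passes rather than computing it in one shot; this sketch shows only the averaging step.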
v1.2 is a sub-component of v1.3, but it seems to function very well on its own, so I am releasing it separately. It has none of the bugs associated with v1.

Compared to the RSCE method, all models were within 1% of each other: Mullein had the highest magnitude at 8%, with the rest closer to 7%.
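The card does not define how "magnitude" is measured, so as one plausible reading, the sketch below computes each model's relative L2 distance from the merge center as a percentage (the function and the toy vectors are hypothetical illustrations, not the RSCE or FLUX formula):

```python
import math

def relative_magnitude(model, center):
    """Percent L2 distance of `model` from `center` (flat float lists)."""
    diff = math.sqrt(sum((m - c) ** 2 for m, c in zip(model, center)))
    base = math.sqrt(sum(c ** 2 for c in center))
    return 100.0 * diff / base

center = [1.0, 0.0, 1.0]
model = [1.1, 0.0, 0.9]
# sqrt(0.01 + 0.01) / sqrt(2) = 0.1, i.e. a 10% relative distance
print(round(relative_magnitude(model, center), 1))  # 10.0
```

Under a measure like this, "within 1% of each other" would mean the per-model percentages span a range of about one point (e.g. 7% to 8%).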

Boreas-24B-v1