Finetuned "standalone" gemma TE's uncensored fp8

#47
by Sikaworld1990 - opened

Hi folks, I am sharing my finetuned uncensored Gemma TEs in fp8. With Kijai's "dtype" fp8/bf16 fix, they work better for me than the regular TEs.
For the differences, please read the model cards.
https://huggingface.co/Sikaworld1990/gemma3-12B-hereticx-sikaworld-ltx-2

https://huggingface.co/Sikaworld1990/gemma-3-12b-it-abliterated-sikaworld-high-fidelity-edition-Ltx-2

Btw, there is a newer/refined abliteration method called Projected Abliteration https://huggingface.co/blog/grimjim/projected-abliteration

For example https://huggingface.co/grimjim/gemma-3-12b-it-projection-abliterated

Heretic has a PR adding Projected Abliteration and Norm-Preserving Biprojected Abliteration (renamed to MPOA) waiting to be merged (soon, according to p-e-w) 😯 https://github.com/p-e-w/heretic/pull/52

Thx, I will check it out asap and might be back with an fp8 version.

Magnitude-Preserving Orthogonal Ablation (MPOA), previously named Norm-Preserving Biprojected Abliteration, seems to be even newer than Projected Abliteration https://huggingface.co/posts/grimjim/803126534676334

For example https://huggingface.co/grimjim/gemma-3-12b-it-norm-preserved-biprojected-abliterated

Btw, if you're doing your own abliterated version of gemma-3-12b using Heretic, you might want to wait for the MPOA pull request to be merged first.

Wowww, if the benchmarks are true this model will surpass any other abliterated Gemma version! Just started downloading.

Does https://huggingface.co/Sikaworld1990/gemma3-12B-hereticx-sikaworld-ltx-2 work with GGUF versions of LTX-2?

Because every time I start a workflow, the console shows clip missing: ['multi_modal_projector.mm_input_projection_weight', ...] followed by a long list of other keys.
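For what it's worth, a "clip missing" message like that usually means the loader expects tensor names that the checkpoint doesn't contain (e.g. the multimodal projector weights were stripped from a text-only export). A minimal sketch of the comparison such a loader does; the key names besides the one from the error above are purely illustrative:

```python
# Keys the text-encoder loader expects (abbreviated, illustrative).
expected = {
    "multi_modal_projector.mm_input_projection_weight",  # from the error message
    "language_model.model.embed_tokens.weight",          # hypothetical example key
}

# Keys actually present in the checkpoint (illustrative text-only dump).
present = {
    "language_model.model.embed_tokens.weight",
}

# The loader reports the difference as "clip missing: [...]".
missing = sorted(expected - present)
print(missing)  # → ['multi_modal_projector.mm_input_projection_weight']
```

If the projector keys are genuinely absent from the file, no loader setting will bring them back; you'd need a checkpoint that still includes them.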
